Ting-Rui Chiang
Verified email at usc.edu - Homepage
Title · Cited by · Year
Semantically-Aligned Equation Generation for Solving and Reasoning Math Word Problems
TR Chiang, YN Chen
The 17th Annual Conference of the North American Chapter of the Association …, 2019
Cited by 102 · 2019
Relating Neural Text Degeneration to Exposure Bias
TR Chiang, YN Chen
The Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural …, 2021
Cited by 17 · 2021
Breaking Down Multilingual Machine Translation
TR Chiang, YP Chen, YT Yeh, G Neubig
Findings of the Association for Computational Linguistics: ACL 2022, 2022
Cited by 12 · 2022
Learning multi-level information for dialogue response selection by highway recurrent transformer
TR Chiang, CW Huang, SY Su, YN Chen
Computer Speech & Language 63, 101073, 2020
Cited by 8 · 2020
An Empirical Study of Content Understanding in Conversational Question Answering
TR Chiang, HT Ye, YN Chen
The Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020), 2020
Cited by 8 · 2020
RAP-Net: Recurrent attention pooling networks for dialogue response selection
CW Huang, TR Chiang, SY Su, YN Chen
Computer Speech & Language 63, 101079, 2020
Cited by 5 · 2020
On a Benefit of Masked Language Model Pretraining: Robustness to Simplicity Bias
TR Chiang
Proceedings of the 13th International Joint Conference on Natural Language …, 2023
Cited by 3* · 2023
Improving Dialogue State Tracking by Joint Slot Modeling
TR Chiang, YT Yeh
The 3rd Workshop on Natural Language Processing for Conversational AI at the …, 2021
Cited by 3 · 2021
Are you doing what I say? On modalities alignment in ALFRED
TR Chiang, YT Yeh, TC Chi, YS Wang
The 1st Workshop on Novel Ideas in Learning-to-Learn through Interaction at …, 2021
Cited by 1 · 2021
Understanding In-Context Learning with a Pelican Soup Framework
TR Chiang, D Yogatama
arXiv preprint arXiv:2402.10424, 2024
2024
On Retrieval Augmentation and the Limitations of Language Model Training
TR Chiang, XV Yu, J Robinson, O Liu, I Lee, D Yogatama
arXiv preprint arXiv:2311.09615, 2023
2023
The Distributional Hypothesis Does Not Fully Explain the Benefits of Masked Language Model Pretraining
TR Chiang, D Yogatama
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
2023
Why Can You Lay Off Heads? Investigating How BERT Heads Transfer
TR Chiang, YN Chen
arXiv preprint arXiv:2106.07137, 2021
2021
Articles 1–13