| Publication | Cited by | Year |
|---|---|---|
| Distill The Image to Nowhere: Inversion Knowledge Distillation for Multimodal Machine Translation. R Peng, Y Zeng, J Zhao. EMNLP, 2022 | 13 | 2022 |
| Neural machine translation with attention based on a new syntactic branch distance. R Peng, Z Chen, T Hao, Y Fang. CCMT (Best Paper Candidate), 2019 | 8 | 2019 |
| Syntax-aware neural machine translation directed by syntactic dependency degree. R Peng, T Hao, Y Fang. Neural Computing and Applications, 2021 | 7 | 2021 |
| Deps-SAN: Neural Machine Translation with Dependency-Scaled Self-Attention Network. R Peng, N Lin, Y Fang, S Jiang, T Hao, B Chen, J Zhao. ICONIP, 2021 | 7 | 2021 |
| Energy-based Automated Model Evaluation. R Peng, H Zou, H Wang, Y Zeng, Z Huang, J Zhao. ICLR, 2024 | 5 | 2024 |
| CAME: Contrastive Automated Model Evaluation. R Peng, Q Duan, H Wang, J Ma, Y Jiang, Y Tu, X Jiang, J Zhao. ICCV, 2023 | 3 | 2023 |
| HybridVocab: Towards Multi-Modal Machine Translation via Multi-Aspect Alignment. R Peng, Y Zeng, J Zhao. ICMR, 2022 | 1 | 2022 |
| Better Sign Language Translation with Monolingual Data. R Peng, Y Zeng, J Zhao. arXiv preprint, 2023 | | 2023 |