Hu, V. T., Wu, D., Asano, Y. M., Mettes, P., Fernández-Méndez, F., Ommer, B., & Snoek, C. G. M. (2024). Flow Matching for Conditional Text Generation in a Few Sampling Steps. In Y. Graham & M. Purver (Eds.), Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2024) (Vol. 2, pp. 380–392). Association for Computational Linguistics. https://doi.org/10.18653/v1/2024.eacl-short.33
Lei, Y., Wu, D., Zhou, T., Shen, T., Cao, Y., Tao, C., & Yates, A. (2024). Meta-Task Prompting Elicits Embeddings from Large Language Models. In L.-W. Ku, A. Martins, & V. Srikumar (Eds.), Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024) (Vol. 1, pp. 10141–10157). Association for Computational Linguistics. https://doi.org/10.18653/v1/2024.acl-long.546
Tan, S., Wu, D., & Monz, C. (2024). Neuron Specialization: Leveraging Intrinsic Task Modularity for Multilingual Machine Translation. In Y. Al-Onaizan, M. Bansal, & Y.-N. Chen (Eds.), Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024) (pp. 6506–6527). Association for Computational Linguistics. https://doi.org/10.18653/v1/2024.emnlp-main.374
Tan, S., Wu, D., Stap, D., Aycock, S., & Monz, C. (2024). UvA-MT’s Participation in the WMT24 General Translation Shared Task. In B. Haddow, T. Kocmi, P. Koehn, & C. Monz (Eds.), Proceedings of the Ninth Conference on Machine Translation (WMT 2024) (pp. 176–184). Association for Computational Linguistics. https://doi.org/10.18653/v1/2024.wmt-1.11
Wu, D., Lei, Y., Yates, A., & Monz, C. (2024). Representational Isomorphism and Alignment of Multilingual Large Language Models. In J. Sälevä & A. Owodunni (Eds.), Proceedings of the 4th Workshop on Multilingual Representation Learning (MRL 2024) (pp. 293–297). Association for Computational Linguistics. https://doi.org/10.18653/v1/2024.mrl-1.24
Wu, D., Lei, Y., Yates, A., & Monz, C. (2024). Representational Isomorphism and Alignment of Multilingual Large Language Models. In Y. Al-Onaizan, M. Bansal, & Y.-N. Chen (Eds.), Findings of the Association for Computational Linguistics: EMNLP 2024 (pp. 14074–14085). Association for Computational Linguistics. https://doi.org/10.18653/v1/2024.findings-emnlp.823
Wu, D., Tan, S., Meng, Y., Stap, D., & Monz, C. (2024). How Far can 100 Samples Go? Unlocking Zero-Shot Translation with Tiny Multi-Parallel Data. In L.-W. Ku, A. Martins, & V. Srikumar (Eds.), Findings of the Association for Computational Linguistics: ACL 2024 (pp. 15092–15108). Association for Computational Linguistics. https://doi.org/10.18653/v1/2024.findings-acl.896
Wu, D., & Monz, C. (2023). Beyond Shared Vocabulary: Increasing Representational Word Similarities across Languages for Multilingual Machine Translation. In H. Bouamor, J. Pino, & K. Bali (Eds.), Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023) (pp. 9749–9764). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.emnlp-main.605
Wu, D., Tan, S., Stap, D., Araabi, A., & Monz, C. (2023). UvA-MT’s Participation in the WMT 2023 General Translation Shared Task. In P. Koehn, B. Haddow, T. Kocmi, & C. Monz (Eds.), Proceedings of the Eighth Conference on Machine Translation (WMT 2023) (pp. 175–180). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.wmt-1.17
Aycock, S. J. S., Stap, D., Wu, D., Monz, C., & Sima'an, K. (2025). Can LLMs Really Learn to Translate a Low-Resource Language from One Grammar Book? https://openreview.net/forum?id=aMBSY2ebPw