2024
Choenni, R. M. V. K., Shutova, E. V., & Garrette, D. (2024). Examining Modularity in Multilingual LMs via Language-Specialized Subnetworks. The Association for Computational Linguistics.
Tong, X., Choenni, R. M. V. K., Lewis, M. A. F., & Shutova, E. V. (2024). Metaphor Understanding Challenge Dataset for LLMs. Advance online publication. https://doi.org/10.48550/arXiv.2403.11810
2023
Choenni, R. M. V. K., Shutova, E. V., & Garrette, D. (2023). How do languages influence each other? Studying cross-lingual data sharing during LM fine-tuning (pp. 13244-13257). Association for Computational Linguistics (ACL).
Choenni, R., Garrette, D., & Shutova, E. (2023). Cross-Lingual Transfer with Language-Specific Subnetworks for Low-Resource Dependency Parsing. Computational Linguistics, 49(3), 613-641. https://doi.org/10.1162/coli_a_00482
Starace, G., Papakostas, K., Choenni, R., Panagiotopoulos, A., Rosati, M., Leidinger, A., & Shutova, E. (2023). Probing LLMs for Joint Encoding of Linguistic Categories. In Findings of the Association for Computational Linguistics: EMNLP 2023 (pp. 7158-7179). Association for Computational Linguistics (ACL).
2022
Choenni, R. M. V. K., & Shutova, E. V. (2022). Investigating language relationships in multilingual sentence encoders through the lens of linguistic typology. Computational Linguistics.
2021
Choenni, R., Shutova, E., & van Rooij, R. (2021). Stepmothers are mean and academics are pretentious: What do pretrained language models learn about you? In M-C. Moens, X. Huang, L. Specia, & S. W. Yih (Eds.), 2021 Conference on Empirical Methods in Natural Language Processing: EMNLP 2021: proceedings of the conference: November 7-11, 2021 (pp. 1477-1491). The Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.emnlp-main.111
2019
Abnar, S., Beinborn, L. M., Choenni, R. M. V. K., & Zuidema, W. H. (2019). Blackbox Meets Blackbox: Representational Similarity & Stability Analysis of Neural Language Models and Brains. In Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP (pp. 191-203). Association for Computational Linguistics (ACL).
Beinborn, L. M., Abnar, S., & Choenni, R. (in press). Robust Evaluation of Language-Brain Encoding Experiments. International Journal of Computational Linguistics and Applications. https://arxiv.org/abs/1904.02547
Choenni, R., Hendrikx, E., & Beinborn, L. M. (2019). On the Evaluation of Structural Similarity between Brain and Computational Models. Poster session presented at Crossing the Boundaries: Language in Interaction, Nijmegen.
Beinborn, L. M., & Choenni, R. (2019). Semantic Drift in Multilingual Representations.