Li, X., Hughes, A., Llugiqi, M., Polat, F., Groth, P., & Ekaputra, F. J. (2023). Knowledge-centric Prompt Composition for Knowledge Base Construction from Pre-trained Language Models. In S. Razniewski, J.-C. Kalo, S. Singhania, & J. Z. Pan (Eds.), Joint Proceedings of the 1st Workshop on Knowledge Base Construction from Pre-Trained Language Models (KBC-LM) and the 2nd Challenge on Language Models for Knowledge Base Construction (LM-KBC), co-located with the 22nd International Semantic Web Conference (ISWC 2023), Athens, Greece, November 6, 2023, Article 3 (CEUR Workshop Proceedings; Vol. 3577). CEUR-WS. https://ceur-ws.org/Vol-3577/paper3.pdf
Li, X., Polat, F., & Groth, P. (2023). Do Instruction-tuned Large Language Models Help with Relation Extraction? In S. Razniewski, J.-C. Kalo, S. Singhania, & J. Z. Pan (Eds.), Joint Proceedings of the 1st Workshop on Knowledge Base Construction from Pre-Trained Language Models (KBC-LM) and the 2nd Challenge on Language Models for Knowledge Base Construction (LM-KBC), co-located with the 22nd International Semantic Web Conference (ISWC 2023), Athens, Greece, November 6, 2023, Article 15 (CEUR Workshop Proceedings; Vol. 3577). CEUR-WS. https://ceur-ws.org/Vol-3577/paper15.pdf
Yilmaz Polat, F. E., Groth, P. T., & Tiddi, I. (2023). Improving Graph-to-Text Generation Using Cycle Training. In Proceedings of the 4th Conference on Language, Data and Knowledge (pp. 256–261). ACL. https://aclanthology.org/2023.ldk-1.24