Research: Korean linguistics features through Transformer-based Language Model Explainability
Research interests: Korean linguistics, Computational Linguistics, Natural Language Processing, Deep Learning
My research centers on transformers and their self-attention mechanism, with the goal of understanding their inner workings and the role self-attention plays in achieving high performance. I also have a strong interest in Korean linguistics, which has helped me gain a deeper understanding of the relationship between the transformer architecture and textual features. In addition, I am investigating how large language models represent out-of-context words and the extent to which they rely on context to capture their semantics. In our experiments, we found that unexpected tokens can cause the model to attend less to the information coming from the tokens themselves when computing their representations, particularly at higher layers, which has important implications for assessing the robustness of LLMs in real-world scenarios.
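As a minimal illustration of this kind of analysis (not the exact experimental setup from our work), one can inspect how much attention each token pays to itself at every layer of a BERT-style model. The model choice, the example sentence, and the averaging over attention heads below are assumptions made purely for brevity.

```python
# Illustrative sketch: per-layer self-attention of each token to itself,
# using a HuggingFace BERT-style model (model name chosen only as an example).
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"  # illustrative choice, not the model used in the research
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_attentions=True)
model.eval()

sentence = "The cat sat on the refrigerator."  # "refrigerator" as a less expected token
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
# outputs.attentions is a tuple with one tensor per layer,
# each of shape (batch, heads, seq_len, seq_len).
for layer_idx, layer_attn in enumerate(outputs.attentions):
    attn = layer_attn[0].mean(dim=0)   # average over heads -> (seq_len, seq_len)
    self_attn = attn.diagonal()        # attention each token pays to itself
    print(f"layer {layer_idx:2d}:",
          " ".join(f"{tok}={w:.2f}" for tok, w in zip(tokens, self_attn)))
```

Comparing the diagonal values across layers for expected versus unexpected tokens is one simple way to probe whether a model relies more on context than on the token itself at higher layers.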
11/2020 - present: PhD candidate at Istituto Italiano di Studi Orientali (ISO), Sapienza University of Rome
Research project title: "Korean linguistics features through Transformer-based Language Model Explainability"
Advisor: Professor Antonetta L. Bruno (Sapienza University of Rome, ISO)
Co-advisor: Professor Fabrizio Silvestri (Sapienza University of Rome, DIAG)
2016 - 2018, Master's degree in East Asian Languages and Civilizations
at Sapienza University of Rome (Italy) and Hanyang University (Seoul, South Korea)
Korean language curriculum
Final grade: 110 cum laude
Dissertation title: "Emotional expressions in contemporary Korea: The case of Webtoon"
2015 - 2016, Exchange program focusing on Korean language
at Hankuk University of Foreign Studies (Seoul, South Korea)
2013 - 2016, Bachelor's degree in East Asian Languages and Civilizations
at Sapienza University of Rome (Italy)
Korean language curriculum
Final grade: 102
Dissertation's title: "Tourism phenomenon in South Korea: economic relevance and government's policies"
2019 - 2022: Linguist at Babelscape srl
Papers:
Valeria Ruscio, Valentino Maiorca, Fabrizio Silvestri, "Attention-likelihood relationship in transformers", 2023