Programming Language Models in Multilingual Settings

Jonathan Katzy*

*Corresponding author for this work

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review



Large language models have become increasingly utilized in programming contexts. However, because this trend has emerged only recently, some aspects have been overlooked. We propose a research approach that investigates the inner mechanics of transformer networks at the neuron, layer, and output-representation level, to understand whether a theoretical limitation prevents large language models from performing optimally in a multilingual setting. We propose to approach this investigation by addressing open problems in machine learning for the software engineering community. This will contribute to a greater understanding of large language models for programming-related tasks, make the findings more approachable to practitioners, and simplify their implementation in future models.

Original language: English
Title of host publication: Proceedings - 2024 ACM/IEEE 46th International Conference on Software Engineering
Subtitle of host publication: Companion, ICSE-Companion 2024
Number of pages: 3
ISBN (Electronic): 9798400705021
Publication status: Published - 2024
Event: 46th International Conference on Software Engineering: Companion, ICSE-Companion 2024 - Lisbon, Portugal
Duration: 14 Apr 2024 - 20 Apr 2024

Publication series

Name: Proceedings - International Conference on Software Engineering
ISSN (Print): 0270-5257


Conference: 46th International Conference on Software Engineering: Companion, ICSE-Companion 2024


Keywords
  • Code Completion
  • Explainable AI
  • Large Language Models
  • Multilingual
  • Programming Languages
  • Software Engineering


