Using GitHub Copilot for Test Generation in Python: An Empirical Study

Khalid El Haji*, Carolin Brandt, Andy Zaidman

*Corresponding author for this work

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

Abstract

Writing unit tests is a crucial part of software development, but it is also recognized as a time-consuming and tedious task. Consequently, numerous test generation approaches have been proposed and investigated. However, most of these test generation tools produce tests that are typically difficult to understand. Recently, Large Language Models (LLMs) have shown promising results in generating source code and supporting software engineering tasks. We therefore investigate the usability of tests generated by GitHub Copilot, a proprietary closed-source code generation tool built on an LLM. We evaluate GitHub Copilot's test generation abilities both with and without an existing test suite, and we study the impact of different code commenting strategies on the generated tests. Our investigation evaluates the usability of 290 tests generated by GitHub Copilot for 53 tests sampled from open-source projects. Our findings highlight that, within an existing test suite, approximately 45.28% of the tests generated by Copilot are passing tests, while 54.72% are failing, broken, or empty. If we instead generate tests with Copilot without an existing test suite in place, we observe that 92.45% of the tests are failing, broken, or empty. Additionally, we study how test method comments influence the usability of the generated tests.
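As a hedged illustration (not code from the study), the kind of artifact the abstract describes might look like the sketch below: a small function under test, a descriptive comment that stands in for one of the paper's "code commenting strategies", and a generated test that falls into the passing category. The `slugify` function and the test name are hypothetical, invented here purely for illustration.

```python
def slugify(text: str) -> str:
    """Lowercase a string and join its words with hyphens (hypothetical unit under test)."""
    return "-".join(text.strip().lower().split())

# A descriptive comment such as the one below is one example of a
# "commenting strategy" a developer might use to steer Copilot's suggestion.
# Test that slugify lowercases input and joins words with hyphens.
def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

# Invoked directly here for demonstration; in the study's setting such tests
# would be run by a test runner and classified as passing, failing, broken, or empty.
test_slugify_basic()
```

In the study's terms, a generated test whose assertion holds would count as passing; one whose assertion fails as failing; one that raises an unexpected error as broken; and one with no body as empty.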

Original language: English
Title of host publication: Proceedings - 2024 IEEE/ACM International Conference on Automation of Software Test, AST 2024
Publisher: Association for Computing Machinery (ACM)
Pages: 45-55
Number of pages: 11
ISBN (Electronic): 9798400705885
DOIs
Publication status: Published - 2024
Event: 5th ACM/IEEE International Conference on Automation of Software Test, AST 2024, co-located with the 46th International Conference on Software Engineering, ICSE 2024 - Lisbon, Portugal
Duration: 15 Apr 2024 - 16 Apr 2024

Publication series

Name: Proceedings - 2024 IEEE/ACM International Conference on Automation of Software Test, AST 2024

Conference

Conference: 5th ACM/IEEE International Conference on Automation of Software Test, AST 2024, co-located with the 46th International Conference on Software Engineering, ICSE 2024
Country/Territory: Portugal
City: Lisbon
Period: 15/04/24 - 16/04/24
