Uncovering Energy-Efficient Practices in Deep Learning Training: Preliminary Steps Towards Green AI

Tim Yarally*, Luís Cruz, Daniel Feitosa, June Sallou, Arie van Deursen

*Corresponding author for this work

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

Abstract

Modern AI practices all strive towards the same goal: better results. In the context of deep learning, the term "results" often refers to the achieved accuracy on a competitive problem set. In this paper, we adopt an idea from the emerging field of Green AI to consider energy consumption as a metric of equal importance to accuracy and to reduce any irrelevant tasks or energy usage. We examine the training stage of the deep learning pipeline from a sustainability perspective, through the study of hyperparameter tuning strategies and model complexity, two factors that vastly impact the overall pipeline's energy consumption. First, we investigate the effectiveness of grid search, random search and Bayesian optimisation during hyperparameter tuning, and we find that Bayesian optimisation significantly dominates the other strategies. Furthermore, we analyse the relationship between the architecture of convolutional neural networks and the energy consumption of three prominent layer types: convolutional, linear and ReLU layers. The results show that convolutional layers are the most computationally expensive by a strong margin. Additionally, we observe diminishing returns in accuracy for more energy-hungry models. The overall energy consumption of training can be halved by reducing the network complexity. In conclusion, we highlight innovative and promising energy-efficient practices for training deep learning models. To expand the application of Green AI, we advocate for a shift in the design of deep learning models, by considering the trade-off between energy efficiency and accuracy.
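The energy argument behind the tuning-strategy comparison can be sketched in a few lines: grid search evaluates every hyperparameter combination, so its cost (and energy use) grows multiplicatively with each added hyperparameter, whereas random search caps the number of trainings up front. This is a minimal illustrative sketch, not the paper's experimental setup; the toy `objective` function stands in for the validation accuracy of an actually trained network, and the hyperparameter grids are hypothetical.

```python
import itertools
import random

def objective(lr, width):
    # Hypothetical stand-in for validation accuracy after training;
    # peaks at lr=0.01, width=64.
    return -((lr - 0.01) ** 2) * 1e4 - ((width - 64) ** 2) / 1e3

lrs = [0.001, 0.005, 0.01, 0.05, 0.1]
widths = [16, 32, 64, 128, 256]

# Grid search: every combination is trained once,
# so the energy bill scales with len(lrs) * len(widths).
grid_trials = list(itertools.product(lrs, widths))
best_grid = max(grid_trials, key=lambda t: objective(*t))

# Random search: a fixed trial budget, independent of grid size,
# so the worst-case energy cost is known before tuning starts.
random.seed(0)
budget = 10
random_trials = [(random.choice(lrs), random.choice(widths))
                 for _ in range(budget)]
best_random = max(random_trials, key=lambda t: objective(*t))

print(len(grid_trials), best_grid)      # 25 trainings for grid search
print(len(random_trials), best_random)  # 10 trainings for random search
```

Bayesian optimisation, which the paper finds dominant, goes one step further than random search by using the results of earlier trials to pick the next configuration, typically reaching a good optimum in even fewer trainings.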

Original language: English
Title of host publication: Proceedings - 2023 IEEE/ACM 2nd International Conference on AI Engineering - Software Engineering for AI, CAIN 2023
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 25-36
Number of pages: 12
ISBN (Electronic): 979-8-3503-0113-7
Publication status: Published - 2023
Event: 2nd IEEE/ACM International Conference on AI Engineering - Software Engineering for AI, CAIN 2023 - Melbourne, Australia
Duration: 15 May 2023 – 16 May 2023

Publication series

Name: Proceedings - 2023 IEEE/ACM 2nd International Conference on AI Engineering - Software Engineering for AI, CAIN 2023

Conference

Conference: 2nd IEEE/ACM International Conference on AI Engineering - Software Engineering for AI, CAIN 2023
Country/Territory: Australia
City: Melbourne
Period: 15/05/23 – 16/05/23

Bibliographical note

Green Open Access added to TU Delft Institutional Repository under the 'You share, we take care!' Taverne project (https://www.openaccess.nl/en/you-share-we-take-care). Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.

Keywords

  • deep learning
  • green AI
  • green software
  • hyper-parameter tuning
  • network architecture

