Local Search is a Remarkably Strong Baseline for Neural Architecture Search

Tom Den Ottelander, Arkadiy Dushatskiy, Marco Virgolin, Peter A.N. Bosman

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

9 Citations (Scopus)

Abstract

Neural Architecture Search (NAS), i.e., the automation of neural network design, has gained much popularity in recent years with increasingly complex search algorithms being proposed. Yet, solid comparisons with simple baselines are often missing. At the same time, recent retrospective studies have found many new algorithms to be no better than random search (RS). In this work we consider the use of a simple Local Search (LS) algorithm for NAS. We particularly consider a multi-objective NAS formulation, with network accuracy and network complexity as two objectives, as understanding the trade-off between these two objectives is arguably among the most interesting aspects of NAS. The proposed LS algorithm is compared with RS and two evolutionary algorithms (EAs), as these are often heralded as being ideal for multi-objective optimization. To promote reproducibility, we create and release two benchmark datasets, named MacroNAS-C10 and -C100, containing 200K saved network evaluations for two established image classification tasks, CIFAR-10 and CIFAR-100. Our benchmarks are designed to be complementary to existing benchmarks, especially in that they are better suited for multi-objective search. We additionally consider a version of the problem with a much larger architecture space. While we find and show that the considered algorithms explore the search space in fundamentally different ways, we also find that LS substantially outperforms RS and even performs nearly as well as state-of-the-art EAs. We believe that this provides strong evidence that LS is truly a competitive baseline for NAS against which new NAS algorithms should be benchmarked.
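To make the kind of baseline the abstract describes concrete, below is a minimal, hypothetical Python sketch of a first-improvement local search over a categorical architecture encoding, with the two objectives (accuracy and complexity) combined through a random weighted scalarization. All names and sizes here (NUM_VARS, NUM_OPTIONS, evaluate, scalarize) are illustrative assumptions, not the authors' algorithm or the MacroNAS API; the evaluation is mocked so the sketch runs on its own.

```python
import random

# Hypothetical encoding: an architecture is a list of categorical choices,
# one per position (e.g., the operation placed in each cell). The sizes
# below are illustrative placeholders, not taken from the paper.
NUM_VARS = 14
NUM_OPTIONS = 3


def evaluate(arch):
    # Stand-in for a real network evaluation; returns (accuracy, complexity).
    # A real run would instead query a tabular benchmark such as the
    # MacroNAS-C10/-C100 datasets released with the paper.
    rng = random.Random(hash(tuple(arch)))
    accuracy = rng.random()
    complexity = sum(arch) / (NUM_VARS * (NUM_OPTIONS - 1))
    return accuracy, complexity


def scalarize(objectives, w):
    # Weighted sum: reward accuracy, penalize complexity.
    accuracy, complexity = objectives
    return w * accuracy - (1.0 - w) * complexity


def local_search(budget=1000, w=None):
    # First-improvement local search with random restarts, run under one
    # scalarization weight; repeating with many weights approximates the
    # accuracy/complexity trade-off front.
    if w is None:
        w = random.random()
    current = [random.randrange(NUM_OPTIONS) for _ in range(NUM_VARS)]
    current_f = scalarize(evaluate(current), w)
    evals = 1
    while evals < budget:
        improved = False
        # Visit variables in random order and try every alternative value.
        for i in random.sample(range(NUM_VARS), NUM_VARS):
            for v in range(NUM_OPTIONS):
                if v == current[i]:
                    continue
                neighbor = current[:i] + [v] + current[i + 1:]
                f = scalarize(evaluate(neighbor), w)
                evals += 1
                if f > current_f:
                    current, current_f = neighbor, f
                    improved = True
                if evals >= budget:
                    return current, current_f
        if not improved:
            # Local optimum reached: restart from a fresh random architecture.
            current = [random.randrange(NUM_OPTIONS) for _ in range(NUM_VARS)]
            current_f = scalarize(evaluate(current), w)
            evals += 1
    return current, current_f


if __name__ == "__main__":
    arch, fitness = local_search(budget=500)
    print("architecture:", arch, "scalarized fitness:", round(fitness, 3))
```

Running local_search repeatedly with different weights w yields a set of trade-off solutions; a dominance-based variant that maintains an archive of non-dominated architectures would be the natural multi-objective alternative to scalarization.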

Original language: English
Title of host publication: Evolutionary Multi-Criterion Optimization
Subtitle of host publication: 11th International Conference, EMO 2021, Proceedings
Editors: Hisao Ishibuchi, Qingfu Zhang, Ran Cheng, Ke Li, Hui Li, Handing Wang, Aimin Zhou
Place of Publication: Cham
Publisher: Springer
Pages: 465-479
Number of pages: 15
ISBN (Electronic): 978-3-030-72062-9
ISBN (Print): 978-3-030-72061-2
DOIs
Publication status: Published - 2021
Event: 11th International Conference on Evolutionary Multi-Criterion Optimization, EMO 2021 - Shenzhen, China
Duration: 28 Mar 2021 – 31 Mar 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer
Volume: 12654
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 11th International Conference on Evolutionary Multi-Criterion Optimization, EMO 2021
Country/Territory: China
City: Shenzhen
Period: 28/03/21 – 31/03/21

Keywords

  • Evolutionary algorithm
  • Local Search
  • Multi-objective NAS
  • NAS baseline
  • Neural Architecture Search
  • Random search
