Optimizing Hyperparameters in Meta-Learning for Enhanced Image Classification

A.M. Vincent, P. Jidesh, A. A. Bini

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

This paper investigates the significance of hyperparameter optimization in meta-learning for image classification tasks. Despite advancements in deep learning, real-time image classification applications often suffer from data inadequacy. Few-shot learning addresses this challenge by enabling learning from limited samples. Meta-learning, a prominent tool for few-shot learning, learns across multiple classification tasks. We explore different types of meta-learners, with a particular focus on metric-based models. We analyze the potential of hyperparameter optimization techniques, specifically Bayesian optimization and its variants, to enhance the performance of these models. Experimental results on the Omniglot and ImageNet datasets demonstrate that incorporating Bayesian optimization, particularly its evolutionary strategy variant, into meta-learning frameworks leads to improved accuracy compared to settings without hyperparameter optimization. Here, we show that by optimizing hyperparameters for individual tasks rather than using a uniform setting, we achieve notable gains in model performance, underscoring the importance of tailored hyperparameter configurations in meta-learning.
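As an illustration of the kind of per-task hyperparameter search the abstract describes, below is a minimal sketch of Bayesian optimization with a Gaussian-process surrogate and an upper-confidence-bound acquisition. This is not the paper's method (which uses an evolutionary-strategy variant of Bayesian optimization inside a meta-learning framework); the one-dimensional objective, the `bayes_opt` function, and the "embedding temperature" hyperparameter are all hypothetical stand-ins chosen to keep the example self-contained.

```python
import numpy as np

def rbf_kernel(a, b, length=0.3):
    # Squared-exponential kernel between two 1-D point sets.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Standard Gaussian-process regression posterior mean and std.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_query)
    K_ss = rbf_kernel(x_query, x_query)
    alpha = np.linalg.solve(K, y_train)
    mu = K_s.T @ alpha
    v = np.linalg.solve(K, K_s)
    var = np.clip(np.diag(K_ss) - np.sum(K_s * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def bayes_opt(objective, bounds, n_init=3, n_iter=10, seed=0):
    # Minimal 1-D Bayesian optimization: fit a GP surrogate to the
    # evaluations so far, then sample where the upper confidence
    # bound (mean + 2*std) is highest. Maximizes `objective`.
    rng = np.random.default_rng(seed)
    xs = rng.uniform(bounds[0], bounds[1], n_init)
    ys = np.array([objective(x) for x in xs])
    grid = np.linspace(bounds[0], bounds[1], 200)
    for _ in range(n_iter):
        mu, sigma = gp_posterior(xs, ys, grid)
        x_next = grid[np.argmax(mu + 2.0 * sigma)]  # UCB acquisition
        xs = np.append(xs, x_next)
        ys = np.append(ys, objective(x_next))
    best = np.argmax(ys)
    return xs[best], ys[best]

# Hypothetical per-task objective: validation score of a metric-based
# learner as a function of one hyperparameter (peak placed at 0.7).
task_objective = lambda t: -(t - 0.7) ** 2
best_t, best_score = bayes_opt(task_objective, (0.0, 1.0))
```

In a per-task setting like the one the abstract argues for, `bayes_opt` would be run once per meta-learning task, with `task_objective` evaluating that task's few-shot validation accuracy, rather than sharing one hyperparameter configuration across all tasks.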

Original language: English
Pages (from-to): 130816-130831
Number of pages: 16
Journal: IEEE Access
Volume: 13
DOIs
Publication status: Published - 2025
Externally published: Yes

