NEO: NEuro-Inspired Optimization—A Fractional Time Series Approach

Sarthak Chatterjee*, Subhro Das, Sérgio Pequito

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Solving optimization problems is a recurrent theme across different fields, including large-scale machine learning systems and deep learning. Often in practical applications, we encounter objective functions where the Hessian is ill-conditioned, which precludes us from using optimization algorithms utilizing second-order information. In this paper, we propose to use fractional time series analysis methods that have successfully been used to model neurophysiological processes in order to circumvent this issue. In particular, the long memory property of fractional time series exhibiting non-exponential power-law decay of trajectories seems to model behavior associated with the local curvature of the objective function at a given point. Specifically, we propose a NEuro-inspired Optimization (NEO) method that leverages this behavior, which contrasts with the short memory characteristics of currently used methods (e.g., gradient descent and heavy-ball). We provide evidence of the efficacy of the proposed method on a wide variety of settings implicitly found in practice.
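To make the contrast between long and short memory concrete, below is a minimal, hypothetical sketch (not the NEO update rule from the paper) of a gradient method whose step aggregates the entire gradient history with power-law-decaying weights, as opposed to gradient descent (current gradient only) or heavy-ball (exponentially decaying memory). The function names `power_law_weights` and `long_memory_descent` and the parameters `alpha` and `lr` are illustrative assumptions introduced here.

```python
import numpy as np

def power_law_weights(alpha, n):
    """Weights w_j ~ (j + 1)^-(1 + alpha) decay polynomially, so every
    past gradient retains a non-negligible influence (long memory)."""
    j = np.arange(n)
    return (j + 1.0) ** -(1.0 + alpha)

def long_memory_descent(grad, x0, alpha=0.6, lr=0.01, iters=200):
    """Illustrative long-memory update: each step is a power-law-weighted
    combination of the whole gradient history, unlike gradient descent
    (one-step memory) or heavy-ball (exponential forgetting).
    Hypothetical sketch only, not the authors' NEO algorithm."""
    x = np.asarray(x0, dtype=float)
    w = power_law_weights(alpha, iters)
    grads = []                                # g_0, ..., g_k
    for k in range(iters):
        grads.append(grad(x))
        # power-law weighted sum over all past gradients
        step = sum(w[j] * grads[k - j] for j in range(k + 1))
        x = x - lr * step
    return x

# Example: a quadratic with a moderately ill-conditioned Hessian.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x                        # gradient of 0.5 * x^T A x
print(long_memory_descent(grad, x0=[1.0, 1.0]))
```

The power-law weights are one simple way to encode long memory; the paper itself builds on fractional time series models of the gradient trajectory rather than this fixed weighting.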

Original language: English
Article number: 724044
Number of pages: 11
Journal: Frontiers in Physiology
Volume: 12
DOIs
Publication status: Published - 2021

Keywords

  • fractional calculus
  • iterative optimization algorithms
  • long memory time series
  • optimization
  • time series processes
