A Hybrid Recursive Implementation of Broad Learning With Incremental Features

Di Liu, Simone Baldi, Wenwu Yu, C. L. P. Chen

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

The broad learning system (BLS) paradigm has recently emerged as a computationally efficient approach to supervised learning. Its efficiency arises from a learning mechanism based on the method of least squares. However, the need to store and invert large matrices can put the efficiency of such a mechanism at risk in big-data scenarios. In this work, we propose a new implementation of BLS in which the need to store and invert large matrices is avoided. The distinguishing features of the designed learning mechanism are as follows: 1) the training process can balance efficient memory usage against the number of required iterations (hybrid recursive learning) and 2) retraining is avoided when the network is expanded (incremental learning). It is shown that, while the proposed framework is equivalent to the standard BLS in terms of trained network weights, much larger networks than the standard BLS can be smoothly trained by the proposed solution, projecting BLS toward the big-data frontier.
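The core idea the abstract describes, solving a least-squares problem recursively over chunks of data so that only small matrices are ever inverted, can be illustrated with a generic block recursive least-squares (RLS) update. This is a minimal sketch of the general technique, not the authors' exact algorithm; the function name, chunk size, and regularization parameter `lam` are illustrative assumptions.

```python
import numpy as np

def rls_chunked(X, y, lam=1e-2, chunk=64):
    """Block recursive least-squares for ridge regression.

    Maintains P = (lam*I + X^T X)^{-1} via the Woodbury identity,
    so only d-by-d and chunk-by-chunk matrices are ever stored or
    inverted -- never a matrix whose size grows with the number of
    samples n. Illustrative sketch, not the paper's algorithm.
    """
    n, d = X.shape
    P = np.eye(d) / lam          # inverse of the regularized Gram matrix so far
    w = np.zeros(d)              # current weight estimate
    for s in range(0, n, chunk):
        Xc, yc = X[s:s + chunk], y[s:s + chunk]
        # Woodbury update: the only inversion is chunk-by-chunk
        K = P @ Xc.T @ np.linalg.inv(np.eye(len(Xc)) + Xc @ P @ Xc.T)
        w = w + K @ (yc - Xc @ w)
        P = P - K @ Xc @ P
    return w
```

After all chunks are processed, `w` equals the batch ridge solution `(lam*I + X.T @ X)^{-1} @ X.T @ y`, which is the sense in which a recursive scheme can be exactly equivalent to the standard least-squares training while trading memory for iterations.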

Original language: English
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOIs
Publication status: Accepted/In press - 2021

Keywords

  • Big data
  • broad learning system (BLS)
  • recursive learning
  • training time

