A restricted Boltzmann machine (RBM) learns a probability distribution over its input samples and has numerous applications, including dimensionality reduction, classification, and generative modeling. Conventional RBMs accept vectorized data, which discards potentially important structural information in the original tensor (multi-way) input. Matrix-variate and tensor-variate RBMs, named MvRBM and TvRBM, have been proposed, but both are restrictive by construction and have limited expressive power. This work presents the matrix product operator RBM (MPORBM), which generalizes Mv/TvRBM through a tensor-network representation, preserves the tensor format in both the visible and hidden layers, and attains higher expressive power. A novel training algorithm that integrates contrastive divergence with an alternating optimization procedure is also developed. Numerical experiments compare the MPORBM with the traditional RBM and the MvRBM on data classification, image completion, and image denoising tasks. The expressive power of the MPORBM as a function of the MPO rank is also investigated.
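For context, the conventional (vector-input) RBM that the paper benchmarks against can be sketched with a minimal CD-1 training loop. This is a generic illustration only, not the paper's MPORBM or its alternating optimization procedure; the class name, layer sizes, and learning rate below are assumptions chosen for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal Bernoulli-Bernoulli RBM trained with one-step contrastive
    divergence (CD-1). Illustrative sketch, not the paper's implementation."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        # Small random weights; zero biases for visible (b) and hidden (c) units
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)
        self.c = np.zeros(n_hidden)
        self.lr = lr

    def cd1_step(self, v0):
        # Positive phase: hidden probabilities and a sample given the data
        ph0 = sigmoid(v0 @ self.W + self.c)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back to the visible layer and up again
        pv1 = sigmoid(h0 @ self.W.T + self.b)
        ph1 = sigmoid(pv1 @ self.W + self.c)
        # Gradient estimate: data statistics minus reconstruction statistics
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b += self.lr * (v0 - pv1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        # Mean squared reconstruction error, as a rough progress signal
        return float(np.mean((v0 - pv1) ** 2))

# Train on a small fixed batch of binary vectors
rbm = RBM(n_visible=16, n_hidden=8)
data = (rng.random((32, 16)) < 0.5).astype(float)
errs = [rbm.cd1_step(data) for _ in range(50)]
```

Note how the visible layer is a flat vector: an image must be vectorized before entering `cd1_step`, which is exactly the loss of multi-way structure that the MPORBM is designed to avoid by keeping tensor formats in both layers.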
Title of host publication: Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN 2019)
Place of publication: Piscataway, NJ, USA
Number of pages: 8
Publication status: Published - 2019
Event: IJCNN 2019: International Joint Conference on Neural Networks - Budapest, Hungary
Duration: 14 Jul 2019 → 19 Jul 2019
Bibliographical note: Green Open Access added to TU Delft Institutional Repository as part of the Taverne project 'You share, we take care!' https://www.openaccess.nl/en/you-share-we-take-care
Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make it publicly available.
- matrix product operators
- restricted Boltzmann machines