Abstract
This paper presents a method for approximate Gaussian process (GP) regression with tensor networks (TNs). A parametric approximation of a GP uses a linear combination of basis functions, where the accuracy of the approximation depends on the total number of basis functions M. We develop an approach that allows us to use an exponential number of basis functions without the corresponding exponential computational complexity. The key idea enabling this is the use of low-rank TNs. We first find a suitable low-dimensional subspace from the data, described by a low-rank TN. In this low-dimensional subspace, we then infer the weights of our model by solving a Bayesian inference problem. Finally, we project the resulting weights back to the original space to make GP predictions. The benefit of our approach comes from the projection onto a smaller subspace: it adapts the shape of the basis functions to the given data, and it allows for efficient computations in the smaller subspace. In an experiment on an 18-dimensional benchmark data set, we demonstrate the applicability of our method to an inverse dynamics problem.
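To illustrate the pipeline the abstract describes (basis-function GP approximation, inference in a low-dimensional subspace, projection back for prediction), here is a minimal sketch. It is not the paper's implementation: random Fourier features stand in for the paper's basis functions, a truncated SVD stands in for the low-rank tensor-network subspace, and all variable names and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(X, omega, b):
    # Random Fourier features; an illustrative stand-in for the paper's
    # basis functions (the exponential TN basis is not reproduced here).
    return np.sqrt(2.0 / omega.shape[1]) * np.cos(X @ omega + b)

# Toy 1-D regression data.
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

M, K = 200, 20                      # total basis functions, subspace size
omega = rng.standard_normal((1, M))
b = rng.uniform(0, 2 * np.pi, size=M)
sigma2 = 0.1 ** 2                   # observation noise variance

Phi = features(X, omega, b)         # (N, M) design matrix

# Step 1: find a low-dimensional subspace from the data. The paper uses a
# low-rank tensor network; a truncated SVD plays that role in this sketch.
_, _, Vt = np.linalg.svd(Phi, full_matrices=False)
V = Vt[:K].T                        # (M, K) subspace basis

# Step 2: Bayesian inference over the K subspace weights (prior N(0, I)).
Psi = Phi @ V                       # (N, K) projected features
A = Psi.T @ Psi / sigma2 + np.eye(K)
w_sub = np.linalg.solve(A, Psi.T @ y / sigma2)

# Step 3: project the weights back to the original space and predict.
w = V @ w_sub                       # (M,) full weight vector
X_test = np.linspace(-3, 3, 200)[:, None]
f_mean = features(X_test, omega, b) @ w
```

The computational benefit mirrors the abstract's point: the linear solve happens in the K-dimensional subspace (K x K system) rather than over all M weights.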
| Original language | English |
|---|---|
| Pages (from-to) | 7288–7293 |
| Number of pages | 6 |
| Journal | IFAC-PapersOnLine |
| Volume | 56 |
| Issue number | 2 |
| Publication status | Published - 2023 |
| Event | 22nd IFAC World Congress, Yokohama, Japan, 9–14 Jul 2023 |
Keywords
- Gaussian process regression
- reduced-rank approximations
- tensor networks