Abstract
The Gaussian process (GP) is a simple yet powerful probabilistic framework for various machine learning tasks. However, exact algorithms for learning and prediction are prohibitively expensive on large datasets due to their inherent computational complexity. To overcome this main limitation, various techniques have been proposed, in particular local GP algorithms that scale "truly linearly" with respect to the dataset size. In this paper, we introduce a hierarchical model based on local GPs for large-scale datasets, which stacks inducing points over inducing points in layers. By using different kernels in each layer, the overall model becomes multi-scale and is able to capture both long- and short-range dependencies. We demonstrate the effectiveness of our model through speed-accuracy performance on challenging real-world datasets.
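To make the multi-scale idea concrete, below is a minimal, illustrative sketch rather than the paper's algorithm: a toy two-layer sparse GP in NumPy, where a coarse layer with few inducing points and a long length-scale captures long-range structure, and a second layer with more inducing points and a short length-scale fits the residual. The subset-of-regressors approximation, the residual-fitting scheme, and all function names are assumptions made purely for illustration.

```python
import numpy as np

def rbf(A, B, lengthscale, variance=1.0):
    """Squared-exponential kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_gp_mean(X, y, Z, Xstar, lengthscale, noise=1e-2):
    """Predictive mean of a subset-of-regressors sparse GP with inducing points Z
    (illustrative stand-in for a local/sparse GP layer)."""
    Kzz = rbf(Z, Z, lengthscale) + 1e-6 * np.eye(len(Z))  # jitter for stability
    Kxz = rbf(X, Z, lengthscale)
    Ksz = rbf(Xstar, Z, lengthscale)
    A = Kxz.T @ Kxz + noise * Kzz                          # m x m system, m = #inducing points
    w = np.linalg.solve(A, Kxz.T @ y)
    return Ksz @ w

# Toy 1-D data: a slow trend plus fast wiggles.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(500, 1))
y = np.sin(0.5 * X[:, 0]) + 0.3 * np.sin(8.0 * X[:, 0]) + 0.05 * rng.standard_normal(500)
Xstar = np.linspace(0.0, 10.0, 200)[:, None]

# Layer 1: few inducing points, long length-scale -> long-range dependencies.
Z1 = np.linspace(0.0, 10.0, 10)[:, None]
f1_train = sparse_gp_mean(X, y, Z1, X, lengthscale=2.0)
f1_test = sparse_gp_mean(X, y, Z1, Xstar, lengthscale=2.0)

# Layer 2: more inducing points, short length-scale -> fit the layer-1 residual.
Z2 = np.linspace(0.0, 10.0, 50)[:, None]
residual = y - f1_train
f2_test = sparse_gp_mean(X, residual, Z2, Xstar, lengthscale=0.3)

prediction = f1_test + f2_test  # combined multi-scale prediction
```

Layering the two kernels this way gives each one responsibility for a different range of dependencies, which is the intuition behind the multi-scale behaviour described in the abstract; the paper's hierarchical construction over inducing points is considerably more involved.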
Original language | English |
---|---|
Publication status | Published - 2017 |
Externally published | Yes |
Event | 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017 - Fort Lauderdale, United States |
Duration | 2017 Apr 20 → 2017 Apr 22 |
Conference
Conference | 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017 |
---|---|
Country/Territory | United States |
City | Fort Lauderdale |
Period | 2017 Apr 20 → 2017 Apr 22 |
ASJC Scopus subject areas
- Artificial Intelligence
- Statistics and Probability