Abstract
A Neural Process (NP) is a map from a set of observed input-output pairs to a predictive distribution over functions, designed to mimic the inference mechanism of other stochastic processes. NPs have been shown to work effectively in tasks that require complex distributions, where traditional stochastic processes struggle, e.g. image completion tasks. This paper concerns the practical capacity of set function approximators despite their universality. By delving deeper into the relationship between an NP and a Bayesian last layer (BLL), we show that NPs may struggle on simple examples that other stochastic processes solve easily. We propose a simple yet effective remedy: the Residual Neural Process (RNP), which leverages a traditional BLL for faster training and better prediction. We demonstrate that the RNP converges faster and performs better, both qualitatively and quantitatively.
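The Bayesian last layer the abstract refers to admits a closed-form Gaussian predictive distribution (Bayesian linear regression on a feature map). Below is a minimal sketch of that computation, not the paper's implementation; the random Fourier feature encoder, the hyperparameters `alpha` and `sigma2`, and all function names are illustrative assumptions.

```python
import numpy as np

def bll_posterior(Phi, y, alpha=1.0, sigma2=0.1):
    # Gaussian posterior over last-layer weights for Bayesian linear
    # regression on features Phi (N x D) with targets y (N,).
    # alpha: prior precision on weights; sigma2: observation noise variance.
    D = Phi.shape[1]
    A = alpha * np.eye(D) + Phi.T @ Phi / sigma2  # posterior precision
    Sigma = np.linalg.inv(A)                      # posterior covariance
    mu = Sigma @ (Phi.T @ y) / sigma2             # posterior mean
    return mu, Sigma

def bll_predict(Phi_star, mu, Sigma, sigma2=0.1):
    # Predictive mean and variance at test features Phi_star (M x D).
    mean = Phi_star @ mu
    var = sigma2 + np.einsum("md,dk,mk->m", Phi_star, Sigma, Phi_star)
    return mean, var

# Toy usage: random Fourier features standing in for a learned encoder.
rng = np.random.default_rng(0)
x = rng.uniform(-3.0, 3.0, size=(20, 1))
y = np.sin(x).ravel() + 0.1 * rng.standard_normal(20)
W = rng.standard_normal((1, 32))
b = rng.uniform(0.0, 2.0 * np.pi, size=32)
phi = lambda x: np.cos(x @ W + b)  # fixed feature map phi(x)

mu, Sigma = bll_posterior(phi(x), y)
x_star = np.linspace(-3.0, 3.0, 5).reshape(-1, 1)
mean, var = bll_predict(phi(x_star), mu, Sigma)
```

The abstract's argument, roughly, is that an NP amortizes this kind of inference with a learned set encoder, and that the RNP recovers speed and accuracy by leveraging a closed-form BLL of this sort alongside it.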
Original language | English |
---|---|
Title of host publication | AAAI 2020 - 34th AAAI Conference on Artificial Intelligence |
Publisher | AAAI Press |
Pages | 4545-4552 |
Number of pages | 8 |
ISBN (Electronic) | 9781577358350 |
DOIs | |
Publication status | Published - 2020 |
Externally published | Yes |
Event | 34th AAAI Conference on Artificial Intelligence, AAAI 2020 - New York, United States. Duration: 2020 Feb 7 → 2020 Feb 12 |
Publication series
Name | AAAI 2020 - 34th AAAI Conference on Artificial Intelligence |
---|---|
Conference
Conference | 34th AAAI Conference on Artificial Intelligence, AAAI 2020 |
---|---|
Country/Territory | United States |
City | New York |
Period | 2020 Feb 7 → 2020 Feb 12 |
Bibliographical note
Publisher Copyright: Copyright © 2020, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
ASJC Scopus subject areas
- Artificial Intelligence