Modulating scalable Gaussian processes for expressive statistical learning
For a learning task, a Gaussian process (GP) learns the statistical relationship between inputs and outputs, since it offers not only the prediction mean but also the associated variability. The vanilla GP, however, struggles to learn complicated distributions with properties such as...
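The abstract's point that a GP returns both a prediction mean and its associated variability can be illustrated with a minimal sketch of exact GP regression. This is a generic textbook implementation (RBF kernel, Cholesky solve), not the scalable or modulated method proposed in the article; the function names and hyperparameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two 1-D input sets."""
    sq = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Exact GP regression: posterior mean and variance at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)  # numerically stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha                     # posterior predictive mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v**2, axis=0)  # posterior predictive variance
    return mean, var

# Toy example: dense noiseless observations of sin(x) on [0, 5]
x = np.linspace(0, 5, 20)
y = np.sin(x)
mean, var = gp_posterior(x, y, np.array([2.5]))
```

Near the training data the predictive mean tracks the true function closely and the predictive variance is small; far from the data the variance grows toward the prior variance, which is exactly the calibrated uncertainty the abstract refers to. The O(n³) Cholesky factorization here is also why scalable approximations are needed for big data.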
| Main Authors: | Liu, Haitao, Ong, Yew-Soon, Jiang, Xiaomo, Wang, Xiaofang |
|---|---|
| Other Authors: | School of Computer Science and Engineering |
| Format: | Journal Article |
| Language: | English |
| Published: | 2022 |
| Subjects: | |
| Online Access: | https://hdl.handle.net/10356/162582 |
Similar Items
- Understanding and comparing scalable Gaussian process regression for big data
  by: Liu, Haitao, et al.
  Published: (2020)
- When Gaussian process meets big data : a review of scalable GPs
  by: Liu, Haitao, et al.
  Published: (2021)
- Remarks on multi-output Gaussian process regression
  by: Liu, Haitao, et al.
  Published: (2020)
- Cope with diverse data structures in multi-fidelity modeling : a Gaussian process method
  by: Liu, Haitao, et al.
  Published: (2020)
- A convergence theorem for extreme values from Gaussian sequences
  by: Welsch, Roy E.
  Published: (2009)