TY  - GEN
ID  - discovery10175941
AV  - public
CY  - Online conference
A1  - Rudi, Alessandro
A1  - Ciliberto, Carlo
KW  - Kernel methods
KW  - Statistical Learning Theory
KW  - Positive Definite Models
KW  - Probabilistic Inference
KW  - Bayesian Inference
KW  - Decision Theory
KW  - Density Estimation
KW  - Probability Representation
EP  - 12
SN  - 1049-5258
T3  - Advances in Neural Information Processing Systems
Y1  - 2021/11/09/
UR  - https://proceedings.neurips.cc/paper_files/paper/2021/hash/a1b63b36ba67b15d2f47da55cdb8018d-Abstract.html
N2  - Finding a good way to model probability densities is key to probabilistic inference. An ideal model should be able to concisely approximate any probability density while also being compatible with two main operations: multiplication of two models (product rule) and marginalization with respect to a subset of the random variables (sum rule). In this work, we show that a recently proposed class of positive semi-definite (PSD) models for non-negative functions is particularly suited to this end. In particular, we characterize both the approximation and generalization capabilities of PSD models, showing that they enjoy strong theoretical guarantees. Moreover, we show that both the sum and product rules can be performed efficiently in closed form via matrix operations, matching the versatility of mixture models. Our results open the way to applications of PSD models in density estimation, decision theory, and inference.
PB  - NeurIPS Proceedings
TI  - PSD Representations for Effective Probability Models
N1  - This version is the version of record. For information on re-use, please refer to the publisher's terms and conditions.
ER  -