TY - JOUR
T1 - Prior and Posterior Networks: A Survey on Evidential Deep Learning Methods For Uncertainty Estimation
AU - Ulmer, Dennis Thomas
AU - Hardmeier, Christian
AU - Frellsen, Jes
PY - 2023/4/4
Y1 - 2023/4/4
AB - Popular approaches for quantifying predictive uncertainty in deep neural networks often involve distributions over weights or multiple models, for instance via Markov Chain sampling, ensembling, or Monte Carlo dropout. These techniques usually incur overhead by having to train multiple model instances or do not produce very diverse predictions. This comprehensive and extensive survey aims to familiarize the reader with an alternative class of models based on the concept of Evidential Deep Learning: For unfamiliar data, they admit "what they don't know" and fall back onto a prior belief. Furthermore, they allow uncertainty estimation in a single model and forward pass by parameterizing distributions over distributions. This survey recapitulates existing works, focusing on the implementation in a classification setting, before surveying the application of the same paradigm to regression. We also reflect on the strengths and weaknesses compared to other existing methods and provide the most fundamental derivations using a unified notation to aid future research.
KW - Predictive Uncertainty
KW - Deep Neural Networks
KW - Evidential Deep Learning
KW - Uncertainty Estimation
KW - Classification and Regression
KW - Bayesian Methods
KW - Parameterizing Distributions
KW - Markov Chain Sampling
KW - Monte Carlo Dropout
KW - Model Ensembling
M3 - Journal article
SN - 2835-8856
JO - Transactions on Machine Learning Research
JF - Transactions on Machine Learning Research
ER -