Know Your Limits: Uncertainty Estimation with ReLU Classifiers Fails at Reliable OOD Detection

Dennis Thomas Ulmer, Giovanni Cinà

Publication: Conference article in journal › Research › Peer-reviewed

Abstract

A crucial requirement for reliable deployment of deep learning models in safety-critical applications is the ability to identify out-of-distribution (OOD) data points: samples that differ from the training data and on which a model might underperform. Previous work has attempted to tackle this problem using uncertainty estimation techniques. However, there is empirical evidence that a large family of these techniques does not detect OOD reliably in classification tasks.

This paper gives a theoretical explanation for these experimental findings and illustrates it on synthetic data. We prove that such techniques are not able to reliably identify OOD samples in a classification setting, since the model's level of confidence generalizes to unseen regions of the feature space. This result stems from the interplay between the representation of ReLU networks as piece-wise affine transformations, the saturating nature of activation functions like the softmax, and the most widely used uncertainty metrics.
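To make this mechanism concrete, the following minimal sketch (illustrative only, not the authors' code) builds a tiny two-layer ReLU classifier with random weights, which here stand in for a trained model, and evaluates its maximum softmax probability on inputs scaled farther and farther from the origin along a fixed direction. Because the network is piece-wise affine, the logits eventually grow linearly with the scale, and the saturating softmax pushes the confidence towards 1, i.e. the model reports high certainty in regions it has never seen.

    # Illustrative sketch (assumed setup, not the paper's code): random weights
    # stand in for a trained ReLU classifier on 2-D inputs with 3 classes.
    import numpy as np

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(16, 2)), rng.normal(size=16)   # hidden layer
    W2, b2 = rng.normal(size=(3, 16)), rng.normal(size=3)    # output layer

    def softmax(z):
        z = z - z.max()                    # subtract max for numerical stability
        e = np.exp(z)
        return e / e.sum()

    def max_softmax_prob(x):
        """Maximum softmax probability, a widely used confidence metric."""
        h = np.maximum(0.0, W1 @ x + b1)   # ReLU: the network is piece-wise affine in x
        return softmax(W2 @ h + b2).max()

    direction = rng.normal(size=2)
    direction /= np.linalg.norm(direction)

    # Move along a fixed ray away from the origin (and any plausible training data).
    for alpha in [1, 10, 100, 1000]:
        print(f"scale {alpha:5d}: max softmax prob = {max_softmax_prob(alpha * direction):.4f}")
    # For almost all directions the confidence climbs towards 1.0 as alpha grows,
    # so far-away (OOD) points look maximally certain rather than uncertain.

The predictive entropy of the softmax output behaves analogously, shrinking towards 0 along the same rays, which is why thresholding either metric cannot reliably flag such OOD points.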
  • Original language: English
  • Journal: Proceedings of Machine Learning Research
  • Volume: 161
  • Pages (from-to): 1766-1776
  • ISSN: 2640-3498
  • Status: Published - 2021
  • Event: 37th Conference on Uncertainty in Artificial Intelligence
  • Duration: 27 Jul 2021 - 30 Jul 2021
  • Internet address: https://www.auai.org/uai2021/


Keywords

  • Out-of-Distribution Detection
  • Deep Learning
  • ReLU Networks
  • Uncertainty Estimation
  • Safety-critical Applications
