Know Your Limits: Uncertainty Estimation with ReLU Classifiers Fails at Reliable OOD Detection

Dennis Thomas Ulmer, Giovanni Cinà

Research output: Conference article in journal · Research · peer-review

Abstract

A crucial requirement for reliable deployment of deep learning models in safety-critical applications is the ability to identify out-of-distribution (OOD) data points, i.e., samples that differ from the training data and on which a model might underperform. Previous work has attempted to tackle this problem using uncertainty estimation techniques. However, there is empirical evidence that a large family of these techniques does not detect OOD reliably in classification tasks.

This paper gives a theoretical explanation for these experimental findings and illustrates it on synthetic data. We prove that such techniques are not able to reliably identify OOD samples in a classification setting, since their level of confidence is generalized to unseen areas of the feature space. This result stems from the interplay between the representation of ReLU networks as piecewise affine transformations, the saturating nature of activation functions like softmax, and the most widely used uncertainty metrics.
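The mechanism described in the abstract lends itself to a quick numerical illustration. The sketch below is not the authors' code or experimental setup; it is a minimal assumption-laden demo that trains a small ReLU classifier (scikit-learn's MLPClassifier, used here as a stand-in) on synthetic 2-D blobs and probes the maximum softmax probability along a ray leaving the training data, where the piecewise-affine logits combined with the saturating softmax yield near-certain predictions far from the data.

```python
# Minimal sketch (not the paper's code): a ReLU classifier on synthetic 2-D
# data reports high softmax confidence far outside the training distribution.
# The architecture, data, and scaling factors below are illustrative choices.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.neural_network import MLPClassifier

# Three well-separated Gaussian clusters serve as in-distribution data.
X, y = make_blobs(n_samples=600, centers=3, cluster_std=0.5, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 32), activation="relu",
                    max_iter=2000, random_state=0).fit(X, y)

# Walk along a ray x = alpha * d away from the training data. Because the
# ReLU network is piecewise affine, the logits eventually grow linearly in
# alpha, so the saturating softmax pushes the max class probability toward 1,
# even though these points were never seen during training.
d = np.array([1.0, 1.0]) / np.sqrt(2.0)
for alpha in [1, 10, 100, 1000]:
    p = clf.predict_proba((alpha * d).reshape(1, -1))[0]
    print(f"alpha={alpha:5d}  max softmax confidence = {p.max():.4f}")
```

Uncertainty metrics derived from these probabilities (e.g., maximum softmax probability or predictive entropy) therefore signal high confidence on such OOD points, which is the failure mode the paper formalizes.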
Original language: English
Journal: Proceedings of Machine Learning Research
Volume: 161
Pages (from-to): 1766-1776
ISSN: 2640-3498
Publication status: Published - 2021
Event: 37th Conference on Uncertainty in Artificial Intelligence
Duration: 27 Jul 2021 - 30 Jul 2021
https://www.auai.org/uai2021/

Conference

Conference: 37th Conference on Uncertainty in Artificial Intelligence
Period: 27/07/2021 - 30/07/2021
Internet address: https://www.auai.org/uai2021/
