A comparison of uncertainty estimation approaches in deep learning components for autonomous vehicle applications

  • Fabio Arnez
  • Huascar Espinoza
  • Ansgar Radermacher
  • François Terrier

Research output: Contribution to journal › Conference article › peer-review

4 Citations (Scopus)

Abstract

A key factor for ensuring safety in Autonomous Vehicles (AVs) is to avoid abnormal behavior under undesirable and unpredicted circumstances. As AVs increasingly rely on Deep Neural Networks (DNNs) to perform safety-critical tasks, different methods for uncertainty quantification have recently been proposed to measure the inevitable sources of errors in data and models. However, uncertainty quantification in DNNs remains a challenging task: these methods typically demand a higher computational load and a larger memory footprint, and introduce extra latency, which can be prohibitive in safety-critical applications. In this paper, we provide a brief and comparative survey of methods for uncertainty quantification in DNNs, along with existing metrics to evaluate uncertainty predictions. We are particularly interested in understanding the advantages and downsides of each method for specific AV tasks and types of uncertainty sources.
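To make the trade-off the abstract describes concrete, one widely used family of methods is Monte Carlo dropout: keeping dropout active at inference and averaging several stochastic forward passes, so the extra compute and latency scale linearly with the number of passes. The following is a minimal NumPy sketch under stated assumptions (a toy, untrained two-layer network with hypothetical sizes, not any specific method from the survey):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network with fixed random weights (illustrative, not trained).
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, p_drop=0.5):
    """One stochastic forward pass: dropout stays active at inference time."""
    h = np.maximum(x @ W1, 0.0)              # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop      # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)            # inverted-dropout scaling
    return h @ W2

def mc_dropout_predict(x, T=100):
    """Run T stochastic passes; the mean is the prediction and the
    variance across passes serves as an uncertainty proxy."""
    preds = np.stack([forward(x) for _ in range(T)])  # shape (T, N, 1)
    return preds.mean(axis=0), preds.var(axis=0)

x = rng.normal(size=(3, 4))                  # three hypothetical input samples
mean, var = mc_dropout_predict(x)
print(mean.shape, var.shape)                 # (3, 1) (3, 1)
```

Note that a single deterministic forward pass is replaced by T passes, which illustrates why such methods can be prohibitive for latency-sensitive AV components.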

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 2640
DOIs
Publication status: Published - 2020
Externally published: Yes
Event: 2020 Workshop on Artificial Intelligence Safety, AISafety 2020 - Yokohama, Japan
Duration: 5 Jan 2021 to 10 Jan 2021
