
DeepMind’s Epistemic Neural Networks Open New Avenues for Uncertainty Modelling in Large and Complex DL Systems


Although effective uncertainty estimation is a key consideration in the development of safe and fair artificial intelligence systems, most of today's large-scale deep learning applications still lack reliable uncertainty estimates.

To accelerate research in this field, a team from DeepMind has proposed epistemic neural networks (ENNs) as an interface for uncertainty modelling in deep learning, together with the KL divergence from a target distribution as a precise metric for evaluating ENNs. In the paper Epistemic Neural Networks, the team also introduces a computational testbed based on inference in a neural network Gaussian process and shows that the proposed ENNs can improve both statistical quality and computational cost.
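The core idea of the ENN interface is that a prediction depends not only on the input but also on an "epistemic index" drawn from a reference distribution; varying the index expresses the model's uncertainty, and the resulting predictive can be scored against a target distribution with a KL divergence. The sketch below is a minimal, illustrative toy version of that interface, not DeepMind's actual code: the `LinearENN` class, the Gaussian index distribution, and the uniform target used for the KL estimate are all assumptions made for the example.

```python
import numpy as np

class LinearENN:
    """Toy epistemic neural network: the prediction depends on an input x
    and an epistemic index z drawn from a reference distribution P_Z.
    A fixed z gives one plausible function; varying z expresses uncertainty."""

    def __init__(self, in_dim, out_dim, index_dim, rng):
        # Base network parameters (just a linear map, for brevity).
        self.W = rng.normal(scale=0.1, size=(in_dim, out_dim))
        # Parameters coupling the epistemic index to the output.
        self.V = rng.normal(scale=0.1, size=(index_dim, out_dim))
        self.index_dim = index_dim

    def sample_index(self, rng):
        # Reference distribution P_Z: a standard Gaussian index (an assumption here).
        return rng.normal(size=self.index_dim)

    def forward(self, x, z):
        # Output logits for input x under epistemic index z.
        return x @ self.W + z @ self.V

def softmax(logits):
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Marginal predictive obtained by averaging over sampled indices,
# plus a Monte Carlo estimate of KL(target || ENN predictive).
rng = np.random.default_rng(0)
enn = LinearENN(in_dim=4, out_dim=3, index_dim=2, rng=rng)
x = rng.normal(size=(5, 4))                          # small batch of inputs
probs = np.mean(
    [softmax(enn.forward(x, enn.sample_index(rng))) for _ in range(100)],
    axis=0,
)
target = np.full_like(probs, 1.0 / probs.shape[-1])  # placeholder target distribution
kl = np.sum(target * (np.log(target) - np.log(probs)), axis=-1).mean()
print("mean KL to target:", kl)
```

In the paper's setting the target distribution comes from the neural network Gaussian process testbed rather than the uniform placeholder used above; the point of the sketch is only to show the shape of the interface: a forward pass taking (input, epistemic index) and an evaluation via KL to a reference posterior.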

The researchers say all existing approaches to uncertainty modelling in deep learning can be expressed as ENNs, presenting a new perspective on the potential of neural networks as computational tools for approximate posterior inference.
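As one concrete illustration of that claim, a deep ensemble fits the interface by letting the epistemic index pick which ensemble member to query. The snippet below is a hedged sketch under that reading; the `EnsembleAsENN` class and its linear members are invented for illustration and are not the paper's implementation.

```python
import numpy as np

class EnsembleAsENN:
    """Illustrative example: a deep ensemble viewed through the ENN interface.
    The epistemic index z is an integer selecting an ensemble member,
    and the reference distribution P_Z is uniform over members."""

    def __init__(self, num_members, in_dim, out_dim, rng):
        # Each member is an independently initialized linear model (toy stand-in
        # for an independently trained network).
        self.members = [
            rng.normal(scale=0.1, size=(in_dim, out_dim))
            for _ in range(num_members)
        ]

    def sample_index(self, rng):
        # P_Z: uniform over ensemble members.
        return rng.integers(len(self.members))

    def forward(self, x, z):
        # Prediction of the member selected by the epistemic index z.
        return x @ self.members[z]

rng = np.random.default_rng(1)
enn = EnsembleAsENN(num_members=10, in_dim=4, out_dim=3, rng=rng)
x = rng.normal(size=(2, 4))
outputs = [enn.forward(x, enn.sample_index(rng)) for _ in range(5)]
```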
