Visual Intelligence aims to develop models that can estimate confidence and quantify the uncertainty of their predictions on complex image data.
Motivation
Deep neural networks are powerful predictive models, but they are often incapable of recognizing when their predictions may be wrong or when the input lies outside the range in which the system can be expected to perform safely. For safety-critical or automated applications, knowledge about the confidence of predictions is essential.
Solving research challenges with new deep learning methodology
Visual Intelligence has developed novel methods that better estimate the confidence and quantify the uncertainty of deep learning predictions. Examples include methods for:
• quantifying uncertainty in pre-trained networks for sandeel segmentation in echosounder data.
• quantifying the uncertainty when identifying geological layers.
• oil spill detection, with a particular emphasis on achieving uncertainty quantification in deep learning models for remote sensing data analysis.
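The exact approaches differ across these projects, but a common way to obtain such uncertainty estimates is to draw multiple stochastic predictions from a model and measure how much they disagree. The sketch below illustrates this idea with Monte Carlo dropout in PyTorch; the tiny network, dropout rate, and echosounder-like input are illustrative assumptions, not the actual models used in these projects.

```python
# A minimal sketch of Monte Carlo dropout for per-pixel uncertainty in a
# segmentation network. The architecture and data are hypothetical; this is a
# generic illustration of uncertainty quantification, not the centre's method.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Toy fully convolutional segmentation network with dropout."""
    def __init__(self, in_channels=1, num_classes=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(),
            nn.Dropout2d(p=0.5),                      # kept active at test time
            nn.Conv2d(16, num_classes, 1),
        )

    def forward(self, x):
        return self.body(x)

def mc_dropout_predict(model, x, n_samples=20):
    """Run several stochastic forward passes and return the mean class
    probabilities plus the predictive entropy as a per-pixel uncertainty map."""
    model.train()  # keep dropout sampling active during inference
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=1) for _ in range(n_samples)]
        )                                    # (n_samples, B, C, H, W)
    mean_probs = probs.mean(dim=0)           # (B, C, H, W)
    entropy = -(mean_probs * (mean_probs + 1e-12).log()).sum(dim=1)  # (B, H, W)
    return mean_probs, entropy

# Example usage on a random single-channel image (e.g. one echosounder patch).
model = TinySegNet()
image = torch.randn(1, 1, 64, 64)
mean_probs, uncertainty = mc_dropout_predict(model, image)
print(mean_probs.shape, uncertainty.shape)
```

Pixels where the sampled predictions disagree receive high entropy, flagging regions where the segmentation should not be trusted blindly.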
By better estimating confidence and quantifying uncertainty, our proposed methods help make deep learning models more robust, reliable, and trustworthy. The models also become more useful in real-world scenarios where uncertainty is inevitable.