Learning Calibrated Belief Functions from Conformal Predictions - Connaissances, Incertitudes et Données
Conference paper, 2023

Learning Calibrated Belief Functions from Conformal Predictions

Abstract

We consider the problem of supervised classification, focusing on calibrating the classifier's outputs. We show that the p-values provided by Inductive Conformal Prediction (ICP) can be interpreted as a possibility distribution over the set of classes. This allows us to use ICP to compute a predictive belief function which is calibrated by construction. We also propose a learning method which provides p-values in a simpler and faster way, by making use of a multi-output regression model. Results obtained on the CIFAR-10 and Digits data sets show that our approach is comparable to standard ICP in terms of accuracy and calibration, while offering reduced complexity and avoiding the use of a calibration set.
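The two ingredients the abstract combines can be sketched in a few lines: standard ICP p-values computed from calibration nonconformity scores, and the consonant belief function induced by reading those p-values as a possibility distribution. This is a minimal illustration, not the paper's method; the normalization by the maximum p-value (so that the distribution attains 1) is an assumption made here for the sketch.

```python
import numpy as np

def icp_p_values(cal_scores, test_scores):
    """ICP p-value for each candidate class: the fraction of calibration
    nonconformity scores at least as large as the test score for that class
    (with the usual +1 smoothing for the test point itself)."""
    n = len(cal_scores)
    return np.array([(np.sum(cal_scores >= s) + 1) / (n + 1)
                     for s in test_scores])

def possibility_to_mass(pi, labels):
    """Mass function of the consonant belief function whose contour
    function is the possibility distribution pi: sort classes by
    decreasing possibility; each nested focal set receives the drop
    in possibility between consecutive classes."""
    order = np.argsort(-pi)       # classes, most possible first
    pi_sorted = pi[order]
    masses = {}
    for i in range(len(pi_sorted)):
        nxt = pi_sorted[i + 1] if i + 1 < len(pi_sorted) else 0.0
        m = pi_sorted[i] - nxt    # mass of the i-th nested focal set
        if m > 0:
            focal = frozenset(labels[order[: i + 1]])
            masses[focal] = masses.get(focal, 0.0) + m
    return masses

if __name__ == "__main__":
    # Toy nonconformity scores (assumed values for illustration only).
    cal = np.array([0.1, 0.2, 0.3, 0.4])
    test = np.array([0.25, 0.05, 0.5])      # one score per candidate class
    p = icp_p_values(cal, test)
    # Normalize so the maximum possibility is 1 (sketch assumption).
    masses = possibility_to_mass(p / p.max(), np.array(["a", "b", "c"]))
    print(p, masses)
```

The nested focal sets make the resulting belief function consonant, so its plausibility on singletons recovers the (normalized) p-values exactly.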
Main file: martinbordini23.pdf (481.39 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-04371387, version 1 (03-01-2024)

Identifiers

  • HAL Id: hal-04371387, version 1

Cite

Vitor Martin Bordini, Sébastien Destercke, Benjamin Quost. Learning Calibrated Belief Functions from Conformal Predictions. 13th International Symposium on Imprecise Probabilities: Theories and Applications (ISIPTA 2023), Jul 2023, Oviedo, Spain. ⟨hal-04371387⟩