CONFORMAL PREDICTION APPLIED TO ENGINE FAILURE PREDICTION
This work investigates conformal prediction as a technique for providing reliable measures of confidence in predictions made by machine learning models. Machine learning is increasingly used for critical decision making, yet ML algorithms do not always produce calibrated estimates of prediction probabilities. This creates challenges, especially in risk-sensitive applications where confidence information is essential. Conformal prediction, originally proposed in 1999, builds predictors on top of traditional ML algorithms and equips each prediction with a valid measure of confidence. In this work, we review the key concepts of conformal prediction and present partial results of a case study on a real-world dataset. Our aim is to evaluate whether conformal prediction offers a more reliable alternative for measuring confidence in ML-based predictions.
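To make the core idea concrete, the following is a minimal sketch of split conformal prediction, one common variant of the method. It uses synthetic data in place of the engine dataset (which is not reproduced here), and all variable names are illustrative rather than taken from the case study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data standing in for engine sensor readings.
X = rng.uniform(0, 10, size=500)
y = 2.0 * X + rng.normal(0, 1, size=500)

# Split the data: a proper training set and a calibration set.
X_train, y_train = X[:300], y[:300]
X_cal, y_cal = X[300:], y[300:]

# Underlying model: a simple least-squares fit (any ML model could be used).
slope, intercept = np.polyfit(X_train, y_train, 1)

def predict(x):
    return slope * x + intercept

# Nonconformity scores on the calibration set: absolute residuals.
scores = np.abs(y_cal - predict(X_cal))

# For target miscoverage alpha, take the ceil((n+1)(1-alpha))/n empirical
# quantile of the scores; this yields finite-sample coverage >= 1 - alpha.
alpha = 0.1
n = len(scores)
level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
q = np.quantile(scores, level, method="higher")

# Prediction interval for a new input: [yhat - q, yhat + q].
x_new = 5.0
yhat = predict(x_new)
interval = (yhat - q, yhat + q)
print(f"prediction {yhat:.2f}, 90% interval [{interval[0]:.2f}, {interval[1]:.2f}]")
```

Under the exchangeability assumption, intervals built this way contain the true outcome with probability at least 1 - alpha, regardless of which underlying model produced the point predictions.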