Model Calibration


Use the Model Calibration feature to recalibrate a model's prediction probabilities to reflect a class distribution that differs from the training distribution. Model Calibration trains and deploys an Isotonic Regression model that adjusts the prediction probabilities of a simClassify+ model. It can be combined with Downsampling to reduce training time on highly class-imbalanced data sets.
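As background, isotonic regression calibration fits a monotone mapping from a model's raw scores to calibrated probabilities using held-out ground truth. The sketch below uses scikit-learn's `IsotonicRegression` to illustrate the idea; the scores and labels are made-up, and the product's internal implementation may differ.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Hypothetical uncalibrated probabilities from a classifier and the
# ground-truth labels for the same rows in a held-out calibration set.
uncalibrated = np.array([0.1, 0.2, 0.35, 0.4, 0.6, 0.7, 0.8, 0.95])
labels = np.array([0, 0, 0, 1, 0, 1, 1, 1])

# Fit a non-decreasing mapping from raw score to calibrated probability,
# clipping scores outside the fitted range into [0, 1].
calibrator = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
calibrator.fit(uncalibrated, labels)

# At prediction time the same mapping is applied to new raw scores.
calibrated = calibrator.predict(uncalibrated)
```

Because the fitted mapping is monotone, it reorders no predictions; it only reshapes the probability values, which is why calibration does not change a model's accuracy at a fixed ranking.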


Model Calibration takes as input a folder other than the training folder, containing a data set that is compatible with the Specifications File used in training and has ground-truth values in the Class column. It treats the class distribution of this Calibration Folder as the target distribution, adjusting both the prediction probabilities and the model's threshold.
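The article does not spell out how a probability is shifted toward a target class distribution. As general background (not the product's documented formula), a standard prior-shift correction reweights a binary probability from the training-set positive rate to the target positive rate and renormalizes:

```python
def adjust_for_prior_shift(p, train_pos_rate, target_pos_rate):
    """Hypothetical helper: reweight a positive-class probability p,
    estimated under the training prior, to reflect a target prior.

    This is the textbook prior-correction identity, shown only to
    illustrate why a different class distribution changes confidences.
    """
    pos = p * target_pos_rate / train_pos_rate
    neg = (1 - p) * (1 - target_pos_rate) / (1 - train_pos_rate)
    return pos / (pos + neg)
```

For example, a score of 0.5 from a model trained on 10% positives rises well above 0.5 once corrected toward a 50% positive target, which is the kind of adjustment calibrating against a Calibration Folder produces.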

Model Calibration does not make a model more or less accurate. However, it can provide prediction confidence values that better reflect the real-world class distribution in cases where the training data was sampled in proportions that do not match the real world.

Model Calibration can be used in conjunction with Stratified Splitting and Random Downsampling to reduce training time on heavily class-imbalanced data sets: downsample the majority class before training, then calibrate the model with a smaller validation split.
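The downsampling half of that workflow can be sketched as follows. This is a minimal illustration with a made-up helper, not the product's Downsampling feature: every minority-class row is kept, and only a random fraction of majority-class rows survives.

```python
import random

def downsample_majority(rows, labels, majority_label, keep_fraction, seed=0):
    """Hypothetical helper: keep all minority rows and a random
    keep_fraction of majority rows, preserving row/label pairing."""
    rng = random.Random(seed)
    kept_rows, kept_labels = [], []
    for row, label in zip(rows, labels):
        if label != majority_label or rng.random() < keep_fraction:
            kept_rows.append(row)
            kept_labels.append(label)
    return kept_rows, kept_labels

# 950 majority-class rows, 50 minority-class rows.
rows = list(range(1000))
labels = [0] * 950 + [1] * 50
small_rows, small_labels = downsample_majority(rows, labels, 0, 0.1)
```

Training on the downsampled set skews the model's learned class prior, which is exactly what the subsequent calibration step corrects.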

The calibrated probability is returned as the Confidence value in the response, and the original probability is provided in an additional field called Uncalibrated Confidence.
