A useful metric for choosing the appropriate classifier for an application is AUC: the Area Under the ROC (Receiver Operating Characteristic) Curve. The Grid Results table lists the AUC values; however, different models can have similar AUCs yet different True Positive Rate versus False Positive Rate behavior. The probabilistic threshold search will find specific Recall metric values for a model, but sometimes you are not looking for the maximum Recall for a class and are instead deciding what Recall/Precision tradeoff to make for your application. In those cases, examining the ROC curve is useful.
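To make that distinction concrete, here is a minimal sketch using scikit-learn (an assumption; it is not part of the product, and the label and score arrays are made-up placeholders) showing two models with identical AUC values but different ROC curves:

```python
# Hypothetical sketch: two models with identical AUC but different ROC curves.
# Assumes scikit-learn is available; all values below are placeholders.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

y_true = np.array([0, 0, 0, 0, 1, 1, 1, 1])                    # ground truth
probs_a = np.array([0.7, 0.4, 0.3, 0.2, 0.9, 0.8, 0.6, 0.5])   # model A scores
probs_b = np.array([0.6, 0.5, 0.3, 0.2, 0.9, 0.8, 0.7, 0.4])   # model B scores

for name, probs in (("A", probs_a), ("B", probs_b)):
    fpr, tpr, thresholds = roc_curve(y_true, probs)
    print(f"model {name}: AUC={roc_auc_score(y_true, probs):.3f}")
    print("  FPR:", fpr, "\n  TPR:", tpr)

# Both models print AUC=0.875, yet model A reaches TPR=1.0 at FPR=0.25
# while model B needs FPR=0.5 to do the same.
```

Plotting the two (FPR, TPR) series would show where the curves diverge, which is exactly the behavior that the AUC value alone hides.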
To display the ROC curves for the models in a grid, select the Grid Analysis item from the grid model's Model Actions menu.
Select ROC Plots from the Grid Analysis Report drop-down (currently the only report option available). The models from the grid will be shown along with their AUC values.
Selecting the checkbox in the table header will display all the curves for all the models in the grid:
Deselecting the header checkbox will remove all the plots. Similarly, you can select and deselect individual models with their associated checkboxes to display just the selected plots.
The plots are based on the thresholds that were searched, and not all thresholds are examined during a grid search. In those cases, you might see only the portion of the curve that was explored.
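Conceptually, the visible segment corresponds to TPR/FPR evaluated only at the thresholds the search actually tried, as in this hedged sketch (the labels, scores, and threshold list are placeholders, not product output):

```python
# Hypothetical sketch: TPR/FPR computed only at the thresholds a search tried.
import numpy as np

y_true = np.array([0, 0, 1, 0, 1, 1, 0, 1])
probs = np.array([0.2, 0.4, 0.5, 0.3, 0.7, 0.8, 0.6, 0.9])
searched = [0.4, 0.5, 0.6]   # only the thresholds the grid search evaluated

for t in searched:
    preds = probs >= t
    tpr = (preds & (y_true == 1)).sum() / (y_true == 1).sum()
    fpr = (preds & (y_true == 0)).sum() / (y_true == 0).sum()
    print(f"threshold={t:.2f}  TPR={tpr:.2f}  FPR={fpr:.2f}")

# The resulting plot covers only this explored range of the full curve.
```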
As in the Grid Results tables, the blue Create Model buttons navigate to the Create Model page with the associated model's parameters filled in. You will need to adjust the Threshold parameter to match the operating point you chose while examining the ROC curve.
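For example, to turn the Recall/Precision tradeoff you chose into a concrete threshold value, you could scan candidate thresholds on validation data, as in this sketch (again assuming scikit-learn, with placeholder arrays and a hypothetical recall target):

```python
# Hypothetical sketch: pick the highest threshold whose recall still meets
# the tradeoff chosen from the curve. All values below are placeholders.
import numpy as np
from sklearn.metrics import precision_recall_curve

y_true = np.array([0, 0, 1, 0, 1, 1, 0, 1])            # validation labels
probs = np.array([0.2, 0.4, 0.5, 0.3, 0.7, 0.8, 0.6, 0.9])  # model scores

precision, recall, thresholds = precision_recall_curve(y_true, probs)
target_recall = 0.75                                    # tradeoff you decided on

# precision/recall have one more entry than thresholds; zip aligns on thresholds.
for p, r, t in zip(precision, recall, thresholds):
    if r >= target_recall:
        candidate = (t, p, r)                           # keep the last one found

print(candidate)   # (threshold, precision, recall) at the chosen operating point
```

Once you have the threshold that gives the tradeoff you want, enter it as the Threshold parameter on the Create Model page.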