This video explains how to compute the AUC metric for an SVM classifier, or any other classifier that outputs class labels directly rather than class probabilities.

### What is Area Under the Curve?

AUC is the area under the ROC (receiver operating characteristic) curve. It is a widely used classification metric.

If you want a recap of how AUC works, here is a simple video on the topic.

### Platt Scaling: How to Compute AUC for an SVM Classifier?

Classifiers such as logistic regression and naive Bayes predict class probabilities as the outcome instead of predicting the labels directly. A new data point is classified as positive if the predicted probability of the positive class is greater than a **threshold**. Each threshold leads to a different classifier, so typical metrics such as accuracy and F1 score depend on the threshold one picks. AUC for such classifiers gives an aggregated metric across all thresholds.
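As a quick sketch of this threshold dependence (assuming scikit-learn is available; the synthetic dataset and threshold values are purely illustrative):

```python
# Sketch: accuracy changes with the threshold, while AUC aggregates over all thresholds.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Illustrative synthetic binary classification data
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]  # predicted probability of the positive class

# Accuracy depends on which threshold we pick...
for t in (0.3, 0.5, 0.7):
    acc = ((proba >= t).astype(int) == y_te).mean()
    print(f"threshold={t}: accuracy={acc:.3f}")

# ...while AUC is a single threshold-free summary
print("AUC:", roc_auc_score(y_te, proba))
```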

Some classifiers, such as an SVM or a perceptron, give the class labels directly as the outcome, not class probabilities. **Does it make sense to compute the AUC metric for classifiers such as the SVM that give class labels as the outcome?**

The answer is yes. It is often useful to obtain class probability outcomes instead of hard class labels.

The video above walks through computing the AUC metric for an SVM classifier, or other classifiers that output class labels directly. It also explains the process of *calibrating* the outcomes of such classifiers to get class probabilities, from which one can compute the AUC.
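One way to apply this kind of calibration can be sketched with scikit-learn's `CalibratedClassifierCV`, whose `method="sigmoid"` option implements Platt scaling (the synthetic dataset and parameter choices here are illustrative, not a definitive recipe):

```python
# Sketch: Platt scaling an SVM's decision scores into probabilities, then computing AUC.
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Illustrative synthetic binary classification data
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A plain LinearSVC has no predict_proba; it only outputs labels / decision scores.
svm = LinearSVC(max_iter=5000)

# Platt scaling: fit a sigmoid on the SVM's decision scores via cross-validation
calibrated = CalibratedClassifierCV(svm, method="sigmoid", cv=5).fit(X_tr, y_tr)
proba = calibrated.predict_proba(X_te)[:, 1]  # calibrated positive-class probabilities

print("AUC:", roc_auc_score(y_te, proba))
```

Note that `SVC(probability=True)` performs the same sigmoid calibration internally; the explicit `CalibratedClassifierCV` form just makes the calibration step visible.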