8.17.1.9. sklearn.metrics.average_precision_score
- sklearn.metrics.average_precision_score(y_true, y_score)
Compute average precision (AP) from prediction scores.
This score corresponds to the area under the precision-recall curve.
Note: this implementation is restricted to the binary classification task.
Parameters :
y_true : array, shape = [n_samples]
    True binary labels.
y_score : array, shape = [n_samples]
    Target scores, which can be either probability estimates of the positive class, confidence values, or binary decisions.
Returns :
average_precision : float
See also
- auc_score : Area under the ROC curve
References
- Wikipedia entry for Average precision: http://en.wikipedia.org/wiki/Information_retrieval#Average_precision
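Examples
A minimal usage sketch; the labels and scores below are illustrative values, not taken from the scikit-learn documentation:
>>> import numpy as np
>>> from sklearn.metrics import average_precision_score
>>> # True binary labels for four samples
>>> y_true = np.array([0, 0, 1, 1])
>>> # Predicted scores, e.g. probability estimates of the positive class
>>> y_scores = np.array([0.1, 0.4, 0.35, 0.8])
>>> ap = average_precision_score(y_true, y_scores)  # returns a float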