mlresearch.metrics.precision_at_k

mlresearch.metrics.precision_at_k(y_true, y_score, k=10, target_label=1)[source]

Calculate precision at k: take the k items with the highest scores (sorted in descending order by score) and compute the ratio of items among them whose label equals target_label.

Warning

This metric is not the same as sklearn.metrics.top_k_accuracy_score, which counts how often y_true is among the top k predicted classes for each sample.

Parameters:
y_true : array-like of shape (n_samples,)

True labels.

y_score : array-like of shape (n_samples,) or (n_samples, n_classes)

Target scores. These can be either probability estimates or non-thresholded decision values (as returned by decision_function on some classifiers). A one-dimensional array of shape (n_samples,) is expected.

k : int, default=10

Number of top-scoring items considered when computing the precision.

target_label : int, default=1

Label value that marks an item as relevant.
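The computation described above can be sketched in a few lines of NumPy. This is an illustrative reimplementation under the documented semantics, not the package's actual source:

```python
import numpy as np

def precision_at_k(y_true, y_score, k=10, target_label=1):
    """Fraction of the k highest-scoring items whose true label equals
    target_label (illustrative sketch of the documented behavior)."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    # Indices of the k largest scores, in descending score order.
    top_k = np.argsort(y_score)[::-1][:k]
    return np.mean(y_true[top_k] == target_label)

scores = np.array([0.9, 0.8, 0.7, 0.6, 0.5])
labels = np.array([1, 0, 1, 1, 0])
# The top 3 scores correspond to labels [1, 0, 1], so precision@3 = 2/3.
print(precision_at_k(labels, scores, k=3))
```

Note that, unlike top_k_accuracy_score, this looks at the top k items across the whole dataset rather than the top k classes per sample.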