truelearn.utils.metrics.get_recall_score

truelearn.utils.metrics.get_recall_score(act_labels: Iterable[bool], pred_labels: Iterable[bool], zero_division: Optional[int] = None) → float

Get the recall score of the prediction.

The recall is the ratio tp / (tp + fn) where tp is the number of true positives and fn the number of false negatives. The recall is intuitively the ability of the classifier to find all the positive samples.

Parameters:
  • act_labels – An iterable of actual labels.

  • pred_labels – An iterable of predicted labels.

  • zero_division – Sets the value to return when there is a zero division. Defaults to None, which sets the value to zero and raises a warning. Acceptable values are 0 and 1, which set the resulting value accordingly.

Returns:

The recall score.
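
A minimal usage sketch, assuming truelearn is installed and that the function follows the zero_division behaviour described above; the label lists below are illustrative only.

```python
from truelearn.utils.metrics import get_recall_score

act_labels = [True, True, False, True, False]
pred_labels = [True, False, False, True, True]

# tp = 2 (indices 0 and 3), fn = 1 (index 1),
# so recall = 2 / (2 + 1) ≈ 0.667.
print(get_recall_score(act_labels, pred_labels))

# With no positive samples in act_labels, tp + fn == 0 and the recall is
# undefined; passing zero_division=0 returns 0 instead of raising a warning.
print(get_recall_score([False, False], [False, True], zero_division=0))
```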