truelearn.utils.metrics.get_precision_score(act_labels: Iterable[bool], pred_labels: Iterable[bool], zero_division: Optional[int] = None) → float

Get the precision score of the prediction.

The precision is the ratio tp / (tp + fp) where tp is the number of true positives and fp the number of false positives. The precision is intuitively the ability of the classifier not to label as positive a sample that is negative.

Parameters:

  • act_labels – An iterable of actual labels.

  • pred_labels – An iterable of predicted labels.

  • zero_division – Sets the value to return when there is a zero division. Defaults to None, which sets the value to zero and raises a warning. Acceptable values are 0 and 1, which set the returned value accordingly.


Returns:

  The precision score.
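The behaviour described above can be sketched in plain Python. This is an illustrative reimplementation, not the library's actual source: the function name `precision_sketch` is hypothetical, and the exact warning text emitted by truelearn on zero division is assumed.

```python
import warnings
from typing import Iterable, Optional


def precision_sketch(
    act_labels: Iterable[bool],
    pred_labels: Iterable[bool],
    zero_division: Optional[int] = None,
) -> float:
    """Illustrative precision: tp / (tp + fp), with zero-division handling."""
    # Materialise the pairs so generator inputs are only consumed once.
    pairs = list(zip(act_labels, pred_labels))
    tp = sum(1 for actual, pred in pairs if pred and actual)
    fp = sum(1 for actual, pred in pairs if pred and not actual)

    if tp + fp == 0:
        if zero_division is None:
            # Default behaviour: warn and return zero (warning text is assumed).
            warnings.warn("Zero division in precision; returning 0.0")
            return 0.0
        # zero_division of 0 or 1 sets the returned value accordingly.
        return float(zero_division)

    return tp / (tp + fp)


# Two true positives and one false positive -> precision 2/3.
score = precision_sketch(
    [True, False, True, True],
    [True, True, False, True],
)

# No positive predictions at all -> zero division; return 1.0 as requested.
degenerate = precision_sketch([True, False], [False, False], zero_division=1)
```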