jingweimo's personal blog at ScienceNet http://blog.sciencenet.cn/u/jingweimo


[Repost] Inverse Relationship Between Precision and Recall

Viewed 2207 times | 2019-8-21 11:10 | Category: Research notes | Source: repost

https://datascience.stackexchange.com/questions/49117/inverse-relationship-between-precision-and-recall/49121#49121

https://medium.com/@timothycarlen/understanding-the-map-evaluation-metric-for-object-detection-a07fe6962cf3

https://tarangshah.com/blog/2018-01-27/what-is-map-understanding-the-statistic-of-choice-for-comparing-object-detection-models/

https://github.com/tensorflow/models/tree/master/research/object_detection#tensorflow-object-detection-api

http://host.robots.ox.ac.uk/pascal/VOC/pubs/everingham10.pdf

https://www.pyimagesearch.com/2016/11/07/intersection-over-union-iou-for-object-detection/

If we decrease the number of false negatives (by selecting more positives), recall always increases, but precision may increase or decrease. Generally, for models better than random, precision and recall have an inverse relationship, but for models worse than random, they have a direct relationship.

It is worth noting that we can artificially construct a sample on which a model that is better than random on the true distribution performs worse than random, so we are assuming that the sample resembles the true distribution.

Recall

We have

TP = P − FN

therefore, recall would be

r = (P − FN) / P = 1 − FN / P

which always increases as FN decreases.
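The monotonicity of recall can be checked numerically. Below is a minimal sketch on a small hypothetical set of labels and scores (not from the original answer): as the decision threshold drops, FN can only shrink, so recall can only rise or stay flat.

```python
# Hypothetical labels and model scores for illustration.
labels = [1, 0, 1, 1, 0]
scores = [0.9, 0.7, 0.6, 0.3, 0.1]

P = sum(labels)  # number of actual positives

recalls = []
for thr in [0.8, 0.5, 0.2, 0.0]:
    # A positive counted as FN when its score does not exceed the threshold.
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s <= thr)
    recalls.append(1 - fn / P)  # r = 1 - FN/P

print(recalls)  # non-decreasing as the threshold drops
```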


Precision

For precision, the relationship is not as straightforward. Let's start with two examples.

First case: decrease in precision, caused by a decrease in false negatives:

label   model prediction
1       0.8
0       0.2
0       0.2
1       0.2

For threshold 0.5 (false negatives = {(1, 0.2)}),

p = 1 / (1 + 0) = 1

For threshold 0.0 (false negatives = {}),

p = 2 / (2 + 2) = 0.5


Second case: increase in precision, caused by a decrease in false negatives (the same as @kbrose's example):

label   model prediction
0       1.0
1       0.4
0       0.1

For threshold 0.5 (false negatives = {(1, 0.4)}),

p = 0 / (0 + 1) = 0

For threshold 0.0 (false negatives = {}),

p = 1 / (1 + 2) ≈ 0.33
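Both worked examples can be reproduced with a few lines of code. This is a minimal sketch with a hypothetical `precision` helper; it predicts positive whenever the score strictly exceeds the threshold, matching the counts in the two cases above.

```python
def precision(labels, scores, thr):
    """Precision = TP / (TP + FP), predicting positive when score > thr."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s > thr)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s > thr)
    # Convention: precision is 1 when nothing is predicted positive.
    return tp / (tp + fp) if tp + fp else 1.0

# First case: precision falls (1 -> 0.5) as the threshold drops.
case1 = [precision([1, 0, 0, 1], [0.8, 0.2, 0.2, 0.2], t) for t in (0.5, 0.0)]
print(case1)  # [1.0, 0.5]

# Second case: precision rises (0 -> 0.33) as the threshold drops.
case2 = [precision([0, 1, 0], [1.0, 0.4, 0.1], t) for t in (0.5, 0.0)]
print(case2)  # [0.0, 0.333...]
```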


It is worth noting the ROC curve for this case (figure not reproduced in this repost).

Analysis of precision based on ROC curve

When we lower the threshold, false negatives decrease and the true-positive rate increases, which is equivalent to moving to the right in the ROC plot. I ran a simulation for better-than-random, random, and worse-than-random models, and plotted ROC, recall, and precision (plots not reproduced in this repost).

As the plots show, by moving to the right, precision decreases for the better-than-random model, fluctuates substantially for the random model, and increases for the worse-than-random model, with slight fluctuations in all three cases. Therefore,

As recall increases, precision generally decreases if the model is better than random, and generally increases if the model is worse than random.
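The simulation described above can be sketched without plotting. Everything below is an assumed setup, not the author's original code: labels are balanced, a "better-than-random" model draws scores for positives from a higher mean (0.7 vs 0.3), and a "worse-than-random" model has the means swapped. Sweeping the threshold downward (moving right along the ROC curve) shows precision falling for the former and rising for the latter.

```python
import random

random.seed(0)

def simulate(flip):
    # Balanced hypothetical labels; Gaussian scores clipped to [0, 1].
    labels = [random.randint(0, 1) for _ in range(5000)]
    scores = []
    for y in labels:
        mean = 0.7 if y == 1 else 0.3          # better than random
        if flip:
            mean = 1 - mean                    # worse than random
        scores.append(min(1.0, max(0.0, random.gauss(mean, 0.2))))
    return labels, scores

def precision_at(labels, scores, thr):
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s > thr)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s > thr)
    return tp / (tp + fp) if tp + fp else 1.0

results = {}
for name, flip in [("better", False), ("worse", True)]:
    labels, scores = simulate(flip)
    # Lowering the threshold = moving right in the ROC plot.
    results[name] = [precision_at(labels, scores, t) for t in (0.8, 0.5, 0.2)]

print(results)  # precision falls for "better", rises for "worse"
```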




https://blog.sciencenet.cn/blog-578676-1194555.html
