
Evaluate precision metric for object detection

`averagePrecision = evaluateDetectionPrecision(detectionResults,trainingData)`

`[averagePrecision,recall,precision] = evaluateDetectionPrecision(___)`

`[___] = evaluateDetectionPrecision(___,threshold)`

`averagePrecision = evaluateDetectionPrecision(detectionResults,trainingData)` returns the average precision of the `detectionResults` compared to the `trainingData`. You can use the average precision to measure the performance of an object detector. For a multiclass detector, the function returns `averagePrecision` as a vector of scores for each object class in the order specified by `trainingData`.
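A minimal sketch of the basic syntax, assuming `detectionResults` and `trainingData` already exist in the workspace (both variable names are placeholders, not a real dataset):

```
% Compare detector output against ground truth. For a multiclass detector,
% ap is a vector with one average-precision score per object class, in the
% order the classes appear in trainingData.
ap = evaluateDetectionPrecision(detectionResults, trainingData);
```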

`[averagePrecision,recall,precision] = evaluateDetectionPrecision(___)` returns data points for plotting the precision–recall curve, using input arguments from the previous syntax.

`[___] = evaluateDetectionPrecision(___,threshold)` specifies the overlap threshold for assigning a detection to a ground truth box.
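The three-output syntax and the `threshold` argument can be combined to plot a precision–recall curve. A hedged sketch, again assuming `detectionResults` (detector output with boxes and scores) and `trainingData` (ground truth boxes) are placeholder tables in the workspace:

```
% Evaluate the detector at a 0.5 overlap (intersection-over-union) threshold.
threshold = 0.5;
[ap, recall, precision] = evaluateDetectionPrecision(detectionResults, ...
    trainingData, threshold);

% Plot the precision-recall curve from the returned data points.
figure
plot(recall, precision)
xlabel('Recall')
ylabel('Precision')
grid on
title(sprintf('Average Precision = %.2f', ap))
```

Raising `threshold` makes the match criterion stricter: a detection counts as a true positive only if its overlap with a ground truth box meets or exceeds the threshold, so average precision typically decreases as the threshold increases.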