Evaluation metrics for a deep learning model

What is the command for computing evaluation metrics such as precision, recall, specificity, and F1 score for a deep learning model?
Should they be computed explicitly from the confusion matrix using the standard formulas, or can they be computed directly in the code and displayed?
Also, are these metrics computed on the validation dataset?
Kindly provide inputs regarding the above.

Accepted Answer

Pranjal Kaura
Pranjal Kaura on 23 Nov 2021
Edited: Pranjal Kaura on 23 Nov 2021
Hey Sushma,
Thank you for bringing this up. The concerned parties are looking at this issue and will try to roll it out in a future release.
For now you can compute these metrics using the confusion matrix. You can refer to this link.
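For reference, here is a minimal sketch of computing these metrics from the confusion matrix for a binary classifier. YTrue and YPred are hypothetical vectors of true and predicted labels, and the sketch assumes the positive class appears first in the class order of the confusion matrix:
% Minimal sketch: YTrue are the true labels, YPred the predicted labels
C = confusionmat(YTrue, YPred);   % rows = true class, columns = predicted class

% Assuming the positive class appears first in the class order of C
TP = C(1,1);  FN = C(1,2);
FP = C(2,1);  TN = C(2,2);

precision   = TP / (TP + FP);
recall      = TP / (TP + FN);     % also called sensitivity
specificity = TN / (TN + FP);
f1Score     = 2 * (precision * recall) / (precision + recall);
To evaluate on the validation dataset, YPred could be obtained by running the trained network on the validation data (for example with classify), with YTrue being the corresponding validation labels.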
Hope this helps!
  2 Comments
Sushma TV
Sushma TV on 25 Nov 2021
Thanks Pranjal. I went through the link you sent, but I have a doubt about plotting precision and recall. Computing the values from the confusion matrix was possible, but I could not figure out the plots. What are the arguments of the perfcurve function for plotting a precision-recall curve?
Pranjal Kaura
Pranjal Kaura on 26 Nov 2021
'perfcurve' is used for plotting performance curves from classifier outputs. To plot a precision-recall curve, set 'XCrit' (the criterion to compute for X) to 'reca' and 'YCrit' to 'prec', which compute recall and precision respectively. You can refer to the following code snippet:
[X, Y] = perfcurve(labels, scores, posclass, 'XCrit', 'reca', 'YCrit', 'prec');
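As a quick sketch of plotting the resulting curve (continuing from the snippet above, where labels, scores, and posclass are assumed to be defined; scores would typically be the network's predicted probabilities for the positive class):
plot(X, Y)                        % X = recall, Y = precision
xlabel('Recall')
ylabel('Precision')
title('Precision-Recall Curve')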



Release: R2020b