How is your model doing?
A quick glance at your most important metrics.
Last 10 Evaluations
| Metric | Description | Latest value | Minimum threshold | Status |
|---|---|---|---|---|
| Accuracy | The proportion of all instances that are correctly predicted. | 0.89 | 0.7 | ▲ above threshold |
| Precision | The proportion of predicted positive instances that are correctly predicted. | 0.93 | 0.7 | ▲ above threshold |
| Recall | The proportion of actual positive instances that are correctly predicted. Also known as sensitivity or the true positive rate. | 1.0 | 0.7 | ▲ above threshold |
| F1 Score | The harmonic mean of precision and recall. | 0.07 | 0.7 | ▼ below threshold |
| AUROC | The area under the receiver operating characteristic curve; measures how well the model ranks positive instances above negative ones. | 0.93 | 0.7 | ▲ above threshold |
| Average Precision | The area under the precision-recall curve (AUPRC); summarizes precision across all recall levels. | 0.98 | 0.7 | ▲ above threshold |
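For reference, the six metrics above can be recomputed offline. The sketch below is a minimal example using scikit-learn; it assumes you have binary labels and predicted probabilities for an evaluation window. The 0.5 decision threshold, the `MIN_THRESHOLD` constant, and the toy data are illustrative assumptions, not values taken from the dashboard.

```python
# Minimal sketch: recompute the dashboard metrics with scikit-learn.
# Assumptions: binary labels in y_true, predicted probabilities in y_prob,
# a 0.5 decision threshold, and a 0.7 minimum threshold for every metric.
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    precision_score,
    recall_score,
    f1_score,
    roc_auc_score,
    average_precision_score,
)

MIN_THRESHOLD = 0.7  # illustrative; mirrors the 0.7 shown on each card


def evaluate(y_true: np.ndarray, y_prob: np.ndarray) -> dict:
    """Return the six dashboard metrics plus an above/below-threshold flag."""
    y_pred = (y_prob >= 0.5).astype(int)  # threshold probabilities to hard labels
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred),
        "auroc": roc_auc_score(y_true, y_prob),                 # uses scores, not labels
        "average_precision": average_precision_score(y_true, y_prob),
    }
    return {
        name: {"value": value, "above_threshold": value >= MIN_THRESHOLD}
        for name, value in metrics.items()
    }


# Example usage with toy data:
y_true = np.array([0, 1, 1, 0, 1, 0, 1, 1])
y_prob = np.array([0.2, 0.8, 0.7, 0.4, 0.9, 0.1, 0.6, 0.85])
for name, result in evaluate(y_true, y_prob).items():
    flag = "▲" if result["above_threshold"] else "▼"
    print(f"{name}: {result['value']:.2f} {flag}")
```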