How is the score being evaluated? ¶
By: bhamm on June 17, 2025, 8:51 a.m.
Dear Organizing Team,
We just noticed your test run on the leaderboard showing a score of 0.500000. The associated metrics JSON contains the following:
{
  "results": {
    "AUC": 0.5,
    "Sensitivity": 0.07999999999999999,
    "Specificity": 0.125
  }
}
According to the guidelines, the final score should be the average of Sensitivity and Specificity, with AUC used as a tie breaker.
Given that, we are wondering how the 0.500000 score was calculated from the above values.
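For reference, here is how we computed the expected score from the reported metrics, assuming the guideline means the arithmetic mean of Sensitivity and Specificity (the variable names are ours, not from your evaluation code):

```python
# Metrics as reported in the leaderboard JSON
sensitivity = 0.07999999999999999
specificity = 0.125
auc = 0.5

# Guideline formula as we understand it: mean of Sensitivity and Specificity,
# with AUC only used to break ties
expected_score = (sensitivity + specificity) / 2
print(f"{expected_score:.6f}")
```

This yields 0.102500 rather than 0.500000, which happens to equal the AUC, so we wonder whether the leaderboard is reporting AUC instead of the combined score.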
Could you please clarify?
Best regards,
Benjamin Hamm