Clarification of the metrics ¶
By: mrokuss on July 18, 2023, 1:40 p.m.
Dear organizers,
I was wondering if you could provide more insight regarding the evaluation metrics. So far I understand that the metrics comprise:
- clDice
- Betti number errors
- Junction/landmark-based F1 score
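While waiting for the official code, here is a minimal sketch of my current understanding of the Betti-0 part of the Betti number error (absolute difference in connected-component counts between prediction and ground truth, assuming binary masks). This is purely my interpretation, not the official implementation:

```python
import numpy as np
from scipy.ndimage import label

def betti0_error(pred: np.ndarray, true: np.ndarray) -> int:
    """Absolute difference in Betti-0 (number of connected components)
    between two binary masks. A sketch of my understanding only."""
    # label() returns (labeled_array, num_components)
    n_pred = label(pred.astype(bool))[1]
    n_true = label(true.astype(bool))[1]
    return abs(n_pred - n_true)
```

Is this roughly the intended computation (and is the Betti-1 error handled analogously via holes/loops), or does the official evaluation differ, e.g. in the connectivity used?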
However, I cannot find the proposed "tutorial notebooks, evaluation code, conversion scripts" mentioned in the info box. Clicking the given GitHub link just redirects to the repo containing the website. Would it be possible to provide the evaluation code?
Thanks a lot! Best,
Max