Regarding "overall-survival-years.json" file ¶
By: caodoanh2001 on May 31, 2024, 7:27 a.m.
Dear Organizer,
Can you provide the exact format of result file "overall-survival-years.json" processed by the container?
Thank you so much!
By: KhrystynaFaryna on June 2, 2024, 3:21 p.m.
Dear Participant,
Thank you for your question and apologies for the delay. The dummy submission example is now available in the LEOPARD-challenge-submission-example GitHub repo. My colleague is currently wrapping up the official baseline repo; it should also be up in the coming days. In addition, the evaluation code for the Sanity check phase is available in the LEOPARD-challenge-evaluation-method GitHub repo, which also includes examples with the corresponding "overall-survival-years.json" file. Please find a detailed explanation of how to proceed with the submission on the challenge's Submission page, and reach out in case something is not clear. Thank you for participating; we are looking forward to your submission.
Kind regards, Khrystyna
By: caodoanh2001 on June 3, 2024, 3:05 a.m.
Dear Organizer,
After going through the example code and the evaluation code, I understood that:
However, it is unclear how the directory tree of "./output" should be structured, because the example "inference.py" code only reads one image and produces one "overall-survival-years.json" in the "/output" folder.
Could you please provide more details regarding "./output"?
For example, it could be:
./output/
|_ <WSI filename 1>/
|__ overall-survival-years.json
|_ <WSI filename 2>/
|__ overall-survival-years.json
|_ <WSI filename 3>/
|__ overall-survival-years.json
|_ ...
|_ <WSI filename N>/
|__ overall-survival-years.json
Thank you so much.
By: caodoanh2001 on June 3, 2024, 3:45 a.m.
I think I understand. The Docker container should read one image at a time.
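Under that reading, the per-case step could be sketched as below. This is a minimal, unofficial sketch, not the challenge's actual `inference.py`: it assumes (without confirmation from this thread) that "overall-survival-years.json" holds a single JSON number, the predicted survival in years for the one WSI the container was given, and that the platform mounts a fresh "/output" for each image, so no per-WSI subfolders are needed. The `write_prediction` helper and its arguments are hypothetical; check the LEOPARD-challenge-evaluation-method repo for the authoritative format.

```python
import json
from pathlib import Path


def write_prediction(output_dir: str, survival_years: float) -> Path:
    """Write the single-number result file, assuming the format described above.

    output_dir would be "/output" inside the container; a plain string here
    so the sketch can be run anywhere.
    """
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    result_path = out / "overall-survival-years.json"
    # Assumption: the entire JSON document is one number, e.g. the file
    # contents are literally "3.5" for a 3.5-year prediction.
    with open(result_path, "w") as f:
        json.dump(float(survival_years), f)
    return result_path
```

If this assumption holds, the container runs once per image: read the single WSI from the input mount, predict, then call `write_prediction("/output", prediction)`, and the platform collects one result file per case.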