Failed in the live leaderboard section

Failed in the live leaderboard section  

  By: AustinChen on Dec. 8, 2024, 11:25 a.m.

Hi, I am wondering why my algorithm failed in the live leaderboard section, even though it succeeded in the debugging section. Additionally, the leaderboard cases I submitted previously, which succeeded before the recent evaluation update, are now also marked as failed. Could you clarify the actual differences between the debugging and the leaderboard phases? Or is there any way to check the error log in the leaderboard section? Unlike in the debugging phase, I can't find the output log in the algorithm results section.

Re: Failed in the live leaderboard section  

  By: gdeotale123 on Dec. 9, 2024, 8:37 a.m.

I am facing the same issue. If it works in the debugging session, it should work in the live leaderboard. Otherwise, how are we supposed to understand the issue?

Re: Failed in the live leaderboard section  

  By: LindaSt on Dec. 9, 2024, 1:19 p.m.

Hi! These are the logs I can see:

Results ID 7ba009fe-e344-47d3-948d-07c0d3a80d7c: "Time limit exceeded" (AustinChen) --> I'm sending GC a message to see if anything changed on their end / ask for more information.

Results ID df713386-6f45-4dae-a2ac-43803e5817cf: "Container Image 006a67d7-26c5-4144-802c-0bc53225a30e was not ready to be used" (gdeotale123) --> GC has told me that the container used for this submission was no longer active, which is why the re-evaluation failed. I suggest you upload/activate the container again and try to resubmit yourself. Let me know if that does not work.

Best, Linda

Re: Failed in the live leaderboard section  

  By: gdeotale123 on Dec. 9, 2024, 4:15 p.m.

Resubmitting worked well. Can you also let us know how much time the execution took? Can we safely assume that if it works in the live leaderboard, it should pass the test phase regardless of time? There are 28 images in the test phase.

Re: Failed in the live leaderboard section  

  By: LindaSt on Dec. 10, 2024, 11:54 a.m.

Currently, the time budget is 30 minutes per algorithm job, i.e., per WSI. However, there is apparently some overhead in the evaluation and algorithm scheduling when the number of cases increases. We'll try to account for that in the test phase. In the unlikely case that your algorithm ran without issues on the validation set but runs over time in the test phase, we'll fix that so you can still participate.
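To catch overruns before submitting, participants can time each case locally against the stated per-WSI budget. A minimal sketch, where `process_wsi` is a hypothetical stand-in for your own inference function:

```python
# Minimal sketch: time each case locally against the stated 30-minute
# per-WSI budget so overruns are caught before submission.
# process_wsi is a hypothetical stand-in for your own inference function.
import time

TIME_BUDGET_S = 30 * 60  # 30 minutes per algorithm job, i.e., per WSI

def run_with_budget(process_wsi, case_id):
    """Run one case, print elapsed time and remaining margin, return the result."""
    start = time.monotonic()
    result = process_wsi(case_id)
    elapsed = time.monotonic() - start
    margin = TIME_BUDGET_S - elapsed
    print(f"{case_id}: {elapsed:.1f}s elapsed, {margin:.1f}s margin")
    if margin < 0.1 * TIME_BUDGET_S:
        print(f"WARNING: {case_id} used more than 90% of the time budget")
    return result
```

Cases that finish with little margin locally are the ones most likely to be hit by the scheduling overhead mentioned above.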

 Last edited by: LindaSt on Dec. 10, 2024, 3:13 p.m., edited 1 time in total.

Re: Failed in the live leaderboard section  

  By: LindaSt on Dec. 10, 2024, 3:15 p.m.

@gdeotale123, is there anything that would explain why your algorithm took longer in the live leaderboard than in debugging? Did you make any changes? The time restrictions are the same for debugging and the live leaderboard. I suggest you re-submit to rule out a temporary issue on the server side.

Re: Failed in the live leaderboard section  

  By: AustinChen on Dec. 11, 2024, 4:04 a.m.

Thanks for your response! I was wondering if the issue is related to the GPU environment, causing the program to fail to detect the GPU and resulting in slower performance. Could you provide detailed information about the GPU environment? Did the environment change after the recent update?
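A silent fallback to CPU is one plausible cause of a large slowdown. One way to rule it out is to log the visible compute device at container startup, so the answer shows up in the algorithm output. A hedged sketch, assuming a PyTorch-based container (the challenge does not mandate PyTorch; adapt the check to your framework):

```python
# Hypothetical startup check (assumes a PyTorch-based container): log whether
# a GPU is visible so a silent CPU fallback is recorded in the algorithm output.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("startup")

def describe_device() -> str:
    """Return a short description of the compute device that will be used."""
    try:
        import torch
    except ImportError:
        return "torch not installed; running on CPU"
    if torch.cuda.is_available():
        return f"GPU visible: {torch.cuda.get_device_name(0)}"
    return "no GPU visible; falling back to CPU (expect a large slowdown)"

log.info(describe_device())
```

If the log line reports a CPU fallback on the platform but a GPU locally, the runtime difference is explained by the environment rather than the algorithm.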

Re: Failed in the live leaderboard section  

  By: LindaSt on Dec. 11, 2024, 9:10 a.m.

Hi! After a bit more discussion with GC, I learned that the runtime, of course, also depends on the file size to process. To adjust for the fact that our newly added case might be larger, we've upped the time limit to 35 minutes. I see in the log that yours took 33 minutes, so if you resubmit, you should be okay now. I did not hear anything from GC regarding changes in the run environment. You can find more info on that here.

Re: Failed in the live leaderboard section  

  By: AustinChen on Dec. 13, 2024, 10:55 a.m.

Hi, thanks again for your reply! I just resubmitted the same container, but it still failed during the leaderboard phase. Is it possible to provide information such as the image size and the ROI ratio? This would help us test locally.
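While waiting for those numbers, a rough projection can be made from a slide you have already timed locally, assuming runtime scales roughly linearly with the ROI-weighted pixel count. This is a hypothetical back-of-the-envelope helper; the numbers in the example are placeholders, not values from the challenge:

```python
# Rough, hypothetical estimate: project the runtime for a new slide from a
# slide you have timed locally, assuming runtime scales linearly with the
# ROI-weighted pixel count. All example numbers are placeholders.
def projected_runtime(known_time_s, known_pixels, new_pixels, roi_ratio=1.0):
    """Scale a measured runtime linearly with (ROI-weighted) pixel count."""
    return known_time_s * (new_pixels * roi_ratio) / known_pixels

# e.g. a slide that took 20 minutes at 4 gigapixels, projected to a
# 6-gigapixel slide where 80% of the area is ROI:
t = projected_runtime(20 * 60, 4e9, 6e9, roi_ratio=0.8)
print(f"projected: {t / 60:.1f} minutes")
```

If the organizers can share the largest image size and typical ROI ratio, this kind of projection shows whether a submission sits safely inside the time limit.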