What to Do When Your Submission Fails?
By: jeremyzhang on July 31, 2023, 9:59 a.m.
Hello, participants who are reading this message,
We have noticed that many of your submissions are encountering errors. Here, we provide some guidance on how to troubleshoot when your algorithm Docker submission fails:
Firstly, on your local machine, run the test.sh script to ensure that your Docker's predictions match the local predictions and that you receive the output "Tests successfully passed...". If the predictions do not match, please review your code carefully.
However, if you compare expected_output.json with your Docker's predictions closely and find only slight differences in the keypoint coordinates, this may be due to differing torch/tensorflow versions. You can align the versions to resolve the issue, or simply ignore the small decimal differences if they do not concern you.
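To check whether two prediction files differ only by such small decimal drift, a tolerance-based comparison can help. Below is a minimal sketch (not part of the official challenge tooling): the function name, the tolerance values, and the example keypoint structure are my own assumptions; adapt them to the actual JSON layout your algorithm produces.

```python
import math

def predictions_close(expected, actual, rel_tol=1e-5, abs_tol=1e-3):
    """Recursively compare two prediction structures, allowing small
    floating-point differences (e.g. from torch/tensorflow version drift)."""
    if isinstance(expected, dict):
        return (isinstance(actual, dict)
                and expected.keys() == actual.keys()
                and all(predictions_close(expected[k], actual[k], rel_tol, abs_tol)
                        for k in expected))
    if isinstance(expected, list):
        return (isinstance(actual, list)
                and len(expected) == len(actual)
                and all(predictions_close(e, a, rel_tol, abs_tol)
                        for e, a in zip(expected, actual)))
    if isinstance(expected, float) or isinstance(actual, float):
        # Numeric leaves: tolerate tiny coordinate differences.
        return math.isclose(expected, actual, rel_tol=rel_tol, abs_tol=abs_tol)
    return expected == actual

# Hypothetical keypoint structure for illustration only.
expected = {"points": [{"name": "S", "point": [123.0, 456.0]}]}
actual = {"points": [{"name": "S", "point": [123.0001, 455.9998]}]}
print(predictions_close(expected, actual))  # small drift within tolerance -> True
```

In practice you would load expected_output.json and your Docker's output with `json.load` and pass the two structures to this function; if it returns True, the mismatch is only rounding noise and can be ignored.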
Secondly, on the algorithm page, use "Try-out Algorithm" to verify that your Docker runs correctly on the Grand-Challenge platform. For specific instructions, refer to the README in this repository: CL-Detection2023-Reference-Docker.
After following these steps, you will most likely not encounter any further issues. However, if you still experience submission errors, please leave a comment below this post, and I will respond promptly and assist you.