Official Suggested Model Size / Inference time limit on docker submission

  By: jyuanfeng8 on July 19, 2022, 2:47 p.m.

Dear participants,

Considering the many questions from participants about the runtime and size limits of the submitted dockers, we are officially setting the following limits on submitted models for a more reasonable and fair comparison:

1) The submitted model must be able to run reasonably on one 3090 GPU with 128 GB of memory.
2) Referring to previous competitions, we suggest limiting the inference time for each case to 15 minutes (on a 3090 GPU). Given the number of test cases for Task 1 and Task 2, the submitted models should finish testing in about 3 days (for each task); a sanity-check sketch follows below.

Because this is a sudden change, we will not penalize submissions that exceed the time limit; we simply hope that teams will strictly adhere to the rule. Extreme cases will be handled separately.
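For teams that want to sanity-check their own runtime against these suggested limits before submitting, here is a minimal timing sketch in Python. The 15-minute and 3-day figures are the suggested limits above; run_inference is a hypothetical stand-in for your model's per-case prediction, and the test_cases directory layout is an assumption:

import time
from pathlib import Path

PER_CASE_LIMIT_S = 15 * 60           # suggested limit: 15 minutes per case on a 3090
TOTAL_BUDGET_S = 3 * 24 * 60 * 60    # suggested budget: ~3 days per task
                                     # (at 15 min/case, 72 h covers about 288 cases)

def run_inference(case_path: Path) -> None:
    """Hypothetical stand-in for your model's per-case prediction."""
    ...

def check_runtime(case_dir: str) -> None:
    total = 0.0
    for case in sorted(Path(case_dir).iterdir()):
        start = time.perf_counter()
        run_inference(case)
        elapsed = time.perf_counter() - start
        total += elapsed
        if elapsed > PER_CASE_LIMIT_S:
            print(f"WARNING: {case.name} took {elapsed / 60:.1f} min (> 15 min)")
    verdict = "within" if total <= TOTAL_BUDGET_S else "OVER"
    print(f"Total: {total / 3600:.1f} h -- {verdict} the 72 h budget")

if __name__ == "__main__":
    check_runtime("test_cases")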

We apologize for not considering this issue earlier and hope that participants will adjust the number of models accordingly. We will postpone the submission deadline to the 22nd to mitigate the impact on participants.

Good luck!

 Last edited by: jyuanfeng8 on Aug. 15, 2023, 12:56 p.m., edited 6 times in total.

Re: Official Model Size / Inference time limit on docker submission  

  By: taich.brave on July 19, 2022, 3:27 p.m.

Hi organizers, I understand what you want and mean, and the lack of an earlier mention is not any participant's fault. However, a 5% score penalty is really critical in this competition. We confirmed that our model's inference time was within 72 hours, but I'm afraid it may change depending on the final test dataset size or the CPU. In fact, it varied between 32 and 40 hours on the evaluation set across my two local machines (same memory and GPU). Moreover, this announcement is very sudden, and I don't have enough time to fix my paper and docker container. Could you carefully reconsider when penalties are applied?

Re: Official Model Size / Inference time limit on docker submission  

  By: taich.brave on July 19, 2022, 4 p.m.

Earlier in this contest, you told me that 'we own sufficient resources to make sure it runs smoothly', and you posted this URL about inference time only a week ago. As I have said many times, I understand that too long an inference time is contrary to the purpose of this challenge, but such sudden changes also do not make for a fair evaluation.

 Last edited by: taich.brave on Aug. 15, 2023, 12:56 p.m., edited 1 time in total.

Re: Official Model Size / Inference time limit on docker submission  

  By: jyuanfeng8 on July 19, 2022, 4:23 p.m.

Hi, thanks for your comments. It's true that this sudden change will have a huge impact, and given that some submissions do fluctuate in runtime, we will not be penalizing submissions that run over time. In fact, at this point we can only hope that teams will strictly adhere to the rule of completing the test in about 3 days. We appreciate your understanding!

 Last edited by: jyuanfeng8 on Aug. 15, 2023, 12:56 p.m., edited 2 times in total.

Re: Official Suggested Model Size / Inference time limit on docker submission  

  By: Isensee on July 19, 2022, 5:28 p.m.

Just to clarify: is this 3 days for each task?

Re: Official Suggested Model Size / Inference time limit on docker submission  

  By: jyuanfeng8 on July 19, 2022, 11:43 p.m.

@Isensee, Yes.

Re: Official Suggested Model Size / Inference time limit on docker submission  

  By: small_dark on July 20, 2022, 2:17 a.m.

Hi @jyuanfeng8

We are very sad to hear about this sudden time-limit policy. Since we need to adjust our models, might reopening the evaluation submission so that we can validate new models be a reasonable solution?

Re: Official Suggested Model Size / Inference time limit on docker submission  

  By: jyuanfeng8 on July 20, 2022, 3:38 a.m.

@small_dark, please note that 3 days is just the suggested test time. After a more thorough discussion, we have decided not to place any limits on submissions beyond that time (extreme cases, such as using dozens of models, will be handled on a case-by-case basis), so please feel free to submit as originally planned. Thank you for your understanding.