Docker's limitations?

  By: maximilien.charmetant on Aug. 2, 2023, 1:39 p.m.

We would like to have more details about the Docker limitations.

Is there a time limit for our model's inference? Is there also a memory limit for the Docker container and our model?

Thanks in advance

Re: Docker's limitations?

  By: luogongning on Aug. 4, 2023, 5:51 a.m.

While we don't have a strict time limit for model inference, we recommend keeping it within a feasible timeframe, roughly one hour, when run on an RTX 3090 GPU.

As for the memory limitation, your Docker container and model should be designed to consume less than 128GB of memory.
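For anyone who wants to check these constraints locally before submitting, a minimal sketch of such a dry run (assuming Docker with the NVIDIA container runtime is installed; the image name `my_model` is purely illustrative and not part of the challenge instructions):

```shell
# Cap the container at 128 GB of RAM (--memory-swap equal to --memory
# disables extra swap), expose the GPU, and time the whole inference run.
time docker run --rm \
    --memory=128g --memory-swap=128g \
    --gpus all \
    my_model
```

If the container gets killed with an out-of-memory error under this cap, the model would likely also exceed the 128GB limit on the evaluation machine; the `time` output gives a rough check against the runtime expectation.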

Re: Docker's limitations?

  By: maximilien.charmetant on Aug. 5, 2023, 2:02 p.m.

Thank you very much for your response.

We have one more question, though: when you say "an hour", do you mean for the full segmentation, detection, and classification over all patients, or just for one patient?

Thank you for your time

Re: Docker's limitations?

  By: luogongning on Aug. 6, 2023, 3:04 a.m.

The expectation is not an exact duration of one hour; rather, you should aim to complete the prediction for each patient per task within several minutes.