lack of memory

lack of memory  

  By: 坤坤kk on July 24, 2024, 11:35 a.m.

Hello organizer, in my latest algorithm submission, inferring a case required only 10 GB of VRAM during my local testing. However, when I submitted it for the online prediction phase of the competition, it reported that there was not enough VRAM. Could you help me take a look at the logs from my most recent submission? Thank you!

Re: lack of memory  

  By: imran.muet on July 24, 2024, 8:06 p.m.

Thank you for reaching out. According to the error log, "The container was killed as it exceeded its memory limit." To address this issue, please consider the following steps:

  1. Optimize Your Code: Remove any unnecessary processing or lines from your inference file.
  2. Simplify Progress Tracking: If you are using tqdm or similar tools to print the progress of your inference, remove them and only keep the essential lines needed for segmentation.

I hope these suggestions help resolve the problem.
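As a rough illustration of step 1, one common way to keep peak memory bounded is to run inference chunk-by-chunk over the volume and write into a pre-allocated low-precision mask, rather than materializing large intermediate float arrays. This is only a sketch under the assumption that your model can process a sub-stack of slices at a time; `segment_chunk` below is a hypothetical placeholder for your real model call.

```python
import numpy as np

def segment_chunk(chunk):
    # Hypothetical placeholder for the real model call
    # (assumption: your model accepts a small 3D sub-stack).
    return (chunk > chunk.mean()).astype(np.uint8)

def segment_volume_lowmem(volume, chunk_size=16):
    """Run inference chunk-by-chunk so peak memory stays bounded."""
    # Pre-allocate a uint8 mask instead of accumulating float outputs.
    out = np.zeros(volume.shape, dtype=np.uint8)
    for z in range(0, volume.shape[0], chunk_size):
        out[z:z + chunk_size] = segment_chunk(volume[z:z + chunk_size])
    return out

vol = np.random.rand(64, 128, 128).astype(np.float32)
mask = segment_volume_lowmem(vol)
print(mask.shape)  # same shape as the input volume
```

The key points are the `uint8` output buffer (8x smaller than float64) and never holding more than one chunk of intermediate results at once.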

Re: lack of memory  

  By: 坤坤kk on July 26, 2024, 2:13 a.m.

Hello, thank you very much for your response and suggestions. I have optimized the code and successfully ran the algorithm in the "Try-out Algorithm" environment. However, when I submitted the algorithm to the competition, it failed again. Could you please help me review the specific log information? Thank you very much.

Re: lack of memory  

  By: imran.muet on July 26, 2024, 1:33 p.m.

This time your algorithm was able to produce the segmentation for almost all input images within five minutes, but it failed on just one image during the competition.

Here's a suggestion that might help:

  • Select ten input volumes from your training data with the largest size. These will serve as a good test to ensure your algorithm can handle large 3D scans efficiently.

  • Optimize your algorithm to ensure it can consistently produce segmentation within the five-minute limit for these larger volumes. This may involve refining your code, improving memory management, or implementing more efficient processing techniques.

  • Once you've made these optimizations, try running the algorithm again in the competition environment.
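To pick those test volumes, a cheap way to rank your training data by size is to sort the image files by their on-disk size, which correlates well with voxel count. A minimal sketch, assuming the images are stored as individual `.nii.gz` files in one directory (adjust the glob pattern to your layout):

```python
from pathlib import Path

def largest_volumes(train_dir, k=10, pattern="*.nii.gz"):
    """Return the k largest volume files, using file size as a
    cheap proxy for voxel count (assumption: one volume per file)."""
    files = list(Path(train_dir).glob(pattern))
    files.sort(key=lambda p: p.stat().st_size, reverse=True)
    return files[:k]
```

You would then run your container on exactly these files and time each case against the five-minute limit.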

Please let me know if you need further assistance with the logs or any other aspect of the algorithm. You've made excellent progress, and I'm confident that with a few more tweaks, you'll achieve the desired results.

All the best with your submission!

Re: lack of memory  

  By: 坤坤kk on July 27, 2024, 10:25 a.m.

Hello, thank you very much for your help. I have successfully run the algorithm in the competition. However, in the test cases, the Dice score for two volumes is completely zero. Could you please inform me of the shapes of these two volumes? This is important for my data preprocessing. If possible, providing the data for these two volumes or the predicted masks would be greatly appreciated!

Re: lack of memory  

  By: imran.muet on July 29, 2024, 3:32 p.m.

Thank you for reaching out! I’m glad to hear you’ve successfully run the algorithm. For the issue with the Dice score being zero for two volumes, you might find the discussion on the training and validation dataset helpful. Please visit the following link.

If you need further assistance, feel free to ask!

Re: lack of memory  

  By: 坤坤kk on Aug. 3, 2024, 9 a.m.

Hello, my two recently submitted algorithms have failed testing. Could you help me look at the log information? I would be very grateful.

Re: lack of memory  

  By: imran.muet on Aug. 3, 2024, 10:40 a.m.

Hi, the relevant error from your log is:

 ValueError: could not broadcast input array from shape (715,369,384) into shape (715,8,384)

It seems the shapes of the 3D input volume and the 3D segmentation output do not match.
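The real fix is to find where your pipeline changes the array shape (for example, resampling without restoring the original grid), but as a defensive last step you can crop or zero-pad the prediction back to the input's shape before writing it out. A minimal sketch, using the shapes from the error above:

```python
import numpy as np

def fit_to_shape(seg, target_shape):
    """Crop or zero-pad a segmentation so it matches the input volume.
    Defensive fallback only; prefer fixing the resampling step that
    produced the mismatch in the first place."""
    out = np.zeros(target_shape, dtype=seg.dtype)
    # Copy the overlapping region along each axis.
    slices = tuple(slice(0, min(s, t)) for s, t in zip(seg.shape, target_shape))
    out[slices] = seg[slices]
    return out

seg = np.ones((715, 8, 384), dtype=np.uint8)   # mismatched prediction
fixed = fit_to_shape(seg, (715, 369, 384))     # input volume's shape
print(fixed.shape)  # (715, 369, 384)
```

Note that zero-padding masks the bug rather than fixing it: the padded region will score as background, so a persistent mismatch will still hurt your Dice score.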