
[Failed] The container ran out of memory.  

  By: 1274227493 on Sept. 14, 2023, 10:01 a.m.

On Sept. 14, 2023, at 3:32 p.m., I submitted my algorithm during the Final Testing Phase, but it failed with the error message "Failed The container ran out of memory." My Docker container image is 6.65 GB. At runtime, the algorithm converts the videos to images and stores them in the "./temp_images/" directory; after model prediction, it deletes the images in "./temp_images/".

I noticed that there are two stages in the submission process: the first is "Executing Algorithm" and the second is "Executing". The "Executing Algorithm" stage completes correctly and predicts all 108 videos, but the submission then fails during the "Executing" stage with "Failed The container ran out of memory." The Grand Challenge platform supports a maximum of 32 GB of memory. Can I request an increase in the memory allocation, or how can I reduce my memory consumption?
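
For context, the runtime flow I described is roughly the sketch below. This is only a simplified illustration, not my exact code: the OpenCV frame reader and the model.predict call are assumed placeholders.

```python
import os
import shutil

import cv2  # assumed frame reader; any video library works

TEMP_DIR = "./temp_images/"


def run_on_video(video_path, model):
    """Roughly the flow described above: dump frames to ./temp_images/,
    run prediction, then delete the frames again."""
    os.makedirs(TEMP_DIR, exist_ok=True)
    capture = cv2.VideoCapture(video_path)
    frame_paths = []
    idx = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        path = os.path.join(TEMP_DIR, f"frame_{idx:06d}.png")
        cv2.imwrite(path, frame)
        frame_paths.append(path)
        idx += 1
    capture.release()

    predictions = model.predict(frame_paths)  # hypothetical per-video call

    shutil.rmtree(TEMP_DIR)  # images are deleted after prediction, as described
    return predictions
```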

Re: [Failed] The container ran out of memory.  

  By: 1274227493 on Sept. 14, 2023, 2:46 p.m.

My understanding is that "Executing Algorithm" refers to model prediction, while "Executing" refers to evaluating the predicted results. Is my understanding correct? If so, could the reason for my "out of memory" issue be that the "surgical-tools.json" file is too large?
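
For anyone debugging the same symptom: a quick way to check whether the output file is the culprit before submitting is to inspect its size and box count locally. A minimal sketch; the path and the top-level "boxes" key are assumptions about the output schema, so adapt them to the challenge's interface.

```python
import json
import os

# Path and schema are assumptions; adjust to the challenge's output interface.
output_path = "./surgical-tools.json"

size_mb = os.path.getsize(output_path) / (1024 ** 2)
with open(output_path) as fh:
    data = json.load(fh)

num_boxes = len(data.get("boxes", []))
print(f"{output_path}: {size_mb:.1f} MB, {num_boxes} boxes")
```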

Re: [Failed] The container ran out of memory.  

  By: 1274227493 on Sept. 15, 2023, 12:58 a.m.

Apologies for the noise. The "out of memory" issue was indeed caused by the size of the "surgical-tools.json" file: it contained a very large number of boxes, which overloaded memory during evaluation. I have raised the probability threshold for writing boxes to the JSON file from 0.01 to 0.1 and have completed the submission.
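
For reference, the thresholding change amounts to something like the sketch below, applied before serialising the output. The "probability" key and the flat {"boxes": [...]} wrapper are assumptions about the output schema rather than the exact code I use.

```python
import json

SCORE_THRESHOLD = 0.1  # raised from 0.01 to shrink the output file


def write_predictions(boxes, output_path="./surgical-tools.json"):
    """Drop low-confidence boxes before writing the JSON so the file
    (and the memory needed to evaluate it) stays small."""
    kept = [b for b in boxes if b.get("probability", 0.0) >= SCORE_THRESHOLD]
    with open(output_path, "w") as fh:
        json.dump({"boxes": kept}, fh)
    return len(kept)
```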