Large training dataset leads to the error "Out of memory. Type "help memory" for your options. Caused by: Out of memory. Type "help memory" for your options."

Hello! I am currently working on a semantic segmentation project, following the code in the example at
https://www.mathworks.com/help/vision/ug/semantic-segmentation-using-deep-learning.html.
I am using 2500 original images and 2500 labeled images of size 720x960x3, with 50 classes, to train a semantic segmentation network (ResNet-based) on an RTX 2080 Ti GPU with 11 GB of memory, but I am getting the error: "Out of memory. Type "help memory" for your options. Caused by: Out of memory. Type "help memory" for your options."
Thank you in advance.

Answers (1)

Joss Knight on 14 Mar 2021
Have you followed this advice from the example?
"The images in the CamVid data set are 720 by 960 in size. Image size is chosen such that a large enough batch of images can fit in memory during training on an NVIDIA™ Titan X with 12 GB of memory. You may need to resize the images to smaller sizes if your GPU does not have sufficient memory or reduce the training batch size."
In your case you are running out of CPU memory. Perhaps you are attempting to load your entire dataset into MATLAB as a single variable rather than storing the data in files?
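A minimal sketch of the datastore approach, assuming your images and labels live in folders on disk. The folder names, class names, label IDs, and batch size below are hypothetical placeholders, not values from your project:

imageDir = fullfile('data','images');    % hypothetical folder of training images
labelDir = fullfile('data','labels');    % hypothetical folder of label images
imds = imageDatastore(imageDir);         % reads image files lazily, one at a time
classNames = ["sky","building"];         % ...extend to all 50 of your classes
labelIDs   = [1 2];                      % ...the pixel value for each class
pxds = pixelLabelDatastore(labelDir, classNames, labelIDs);
trainingData = combine(imds, pxds);      % paired (image, label) datastore
opts = trainingOptions('sgdm', ...
    'MiniBatchSize', 4, ...              % lower this further if the GPU still runs out of memory
    'MaxEpochs', 30);

This way only the current mini-batch is held in memory, instead of the whole 2500-image dataset.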
Joss Knight on 18 Mar 2021
I am not an expert on deep learning. My understanding is that for semantic segmentation you are better off cropping the input than downsampling it. However, it is normal for the network's training input size to be smaller than the images it is later used to process; the semanticseg function takes care of this, as sketched below.
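A rough sketch of training on random crops rather than downsampled images, reusing the imds and pxds datastores from above. The patch size, patch count, and the lgraph variable are illustrative assumptions, not tested values:

patchSize = [360 480];                   % crops smaller than the 720x960 originals
patchds = randomPatchExtractionDatastore(imds, pxds, patchSize, ...
    'PatchesPerImage', 8);               % draws random crops from each image
net = trainNetwork(patchds, lgraph, opts);   % lgraph: your ResNet-based layer graph

At inference time you can pass the full-size image straight to semanticseg:

I = imread('testImage.png');             % hypothetical test image
C = semanticseg(I, net);                 % returns a categorical label per pixel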
