Transfer learning without imresize or imageDatastore

With many samples of small images, it would be nice to load the data into RAM and perform transfer learning without streaming images from slower storage (imageDatastore). Unfortunately, if I have 50x50x1 images and have to resize them to match alexnet's 227x227x3 input, I can only keep about (50*50*1)/(227*227*3) ≈ 1.6% as many samples in RAM.
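For reference, the memory math works out like this (assuming single-precision images, 4 bytes per element):

bytesSmall = 50*50*1*4;        % 10,000 bytes per 50x50x1 image
bytesLarge = 227*227*3*4;      % 618,348 bytes per 227x227x3 resized image
ratio = bytesSmall/bytesLarge  % ~0.016, i.e. only ~1.6% as many resized samples fit in RAM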
Does anyone know a fix? A custom layer that resizes would work, but that's a lot of work.
Using R2017b.

Answers (1)

Shounak Mitra on 20 Aug 2018
Hi Michael,
Thanks for your question.
If you need to resize, wrap your image datastore in an augmentedImageDatastore. It will be much faster than a custom ReadFcn because it preserves prefetching under the hood. If you don't need to resize, augment, or otherwise customize reading, then a vanilla imageDatastore is simplest.
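A minimal sketch of that approach for your in-memory case (assuming R2018a or later and the Deep Learning Toolbox Model for AlexNet support package; X, Y, and trainedNet are example names, with dummy data standing in for your 50x50x1 images):

% Dummy in-memory data standing in for your images and labels
X = rand(50, 50, 1, 200, 'single');      % 50x50x1xN array already in RAM
Y = categorical(randi(3, 200, 1));       % N labels
numClasses = numel(categories(Y));

% Resize and replicate the gray channel to RGB on the fly, per mini-batch
augimds = augmentedImageDatastore([227 227], X, Y, ...
    'ColorPreprocessing', 'gray2rgb');

% Standard transfer-learning surgery on AlexNet: swap the final layers
net = alexnet;
layers = [
    net.Layers(1:end-3)
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

opts = trainingOptions('sgdm', 'MiniBatchSize', 64);
trainedNet = trainNetwork(augimds, layers, opts);

This keeps only the small 50x50x1 arrays in RAM; the resized 227x227x3 copies exist only one mini-batch at a time.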
Another option is to create a custom layer but you're right, it'll take some work.
HTH Shounak
  1 Comment
Michael Benton on 2 Sep 2018
Edited: Michael Benton on 4 Sep 2018
According to https://www.mathworks.com/help/nnet/ref/augmentedimagedatastore.html, this can be done. I have updated to R2018a, and it works. Thanks!
