
Mat and imread memory management

asked 2016-07-26 05:13:11 -0600

laxn_pander

updated 2016-07-26 05:13:58 -0600

Hey there,

I am looking for the best practice for caching a lot of images of type cv::Mat. Is it okay to just push_back them into a vector<Mat> and fetch them when I need them? The reason I ask: I tried to load ~150 images (300 MB on disk) with imread, and after about 100 the system slows down extremely. Looking at the system monitor, I noticed that my 5 GB of RAM gets eaten up until the system breaks down. Code snippet for my image reading below:

    cout << "Start reading image inputs..." << endl;
    if (argc == 1) {
        cout << "Not enough image data." << endl;
        return 1;
    }
    vector<Mat> imagesArg;
    for (int i = 1; i < argc; i++) {
        Mat img = imread(argv[i]);
        if (img.empty()) {
            cout << "Can't read image " << argv[i] << "." << endl;
            return 1;
        }
        imagesArg.push_back(img);
        img.release();
    }
    cout << "Finished reading " << imagesArg.size() << " images." << endl;

Thanks in advance for an answer!

Lax


Comments

Have you tried to declare Mat img outside the for loop?

bjorn89 ( 2016-07-26 11:25:46 -0600 )

When you do imagesArg.push_back(img), you push back a reference to the local variable Mat img, which soon goes out of scope. If you want to keep the data, you should use imagesArg.push_back(img.clone()) instead. Also, because img goes out of scope on the next iteration anyway, you don't have to release it.

strann ( 2016-07-26 12:25:32 -0600 )

@bjorn89: Doesn't change anything :< @strann: But my problem is not that my data goes out of scope; my RAM is flooded until the system stops working. Plus, I don't think you are right: it may be a push_back of a local variable, but cv::Mat is reference-counted, so the data stays alive as long as imagesArg refers to it.

laxn_pander ( 2016-07-26 12:36:53 -0600 )

300 MB of compressed images on disk can easily expand to several GB of raw pixel data in memory.

berak ( 2016-07-27 00:44:44 -0600 )

@berak: Hm okay, but do 100 images (4000 x 3000 px) sound like too much for the RAM to handle to you? And 300 MB expanding to 3 GB?

laxn_pander ( 2016-07-27 01:02:18 -0600 )

Do the maths:

4000 px × 3000 px × 3 bytes × 100 images = 3,600,000,000 bytes

That's 3.6 GB — enough to get your OS into trouble. (How much RAM is there? Your web browser will already eat half of it..)

berak ( 2016-07-27 01:18:09 -0600 )

Point for you. Sometimes I lack the basic skills. :D I have 5.8 GB of RAM according to the system monitor, so 3.6 GB would be enough to kill it.

laxn_pander ( 2016-07-27 01:31:06 -0600 )
  • Do you really need to preload all of the images?
  • 3000 x 4000 seems insanely large. Do you really need that resolution? (What is it for?) Maybe resizing the images before you store them in your vector helps.
berak ( 2016-07-27 05:50:13 -0600 )

1 answer


answered 2016-07-26 20:09:22 -0600

There are some problems with your code:

  1. If you want to read a lot of images, do not pass their names as command-line arguments. Instead, put the names into a file and pass that file name as a single argument. Your command line stays short and neat, and you avoid the limit on the number of command-line arguments a program can take.
  2. If you put a local Mat into an outer vector<Mat>, you should remove the img.release() statement, since it releases the object that the vector is meant to manage. Or, if you keep the img.release() statement, push back a clone of each read image instead.
  3. If your images are all the same size and you know their number, you can initialize the vector<Mat> with those parameters, e.g. vector<Mat> imagesArg(300, cv::Mat(128, 128, CV_8U)); and then read each image as imagesArg[i] = imread(imgNames[i], 0);

In fact, 300 full-resolution images is a lot for a program to hold, and keeping all of them in a vector is not a good approach. You'd better read each image, process it, and store only the results for all the images in a vector.


Comments

1) Good point, I will try this one as soon as I can! 2) The release statement was in fact only there to check whether it changed anything about the memory problem; I have already removed it. :) 3) This one I will also check and let you know what happens!

Not saving them in some way would be a challenging task. I am writing a dynamic stitching process, and at different points of the program old data (the first measurement, for example) may be accessed. Re-reading it from disk then might not be optimal. Especially since, as you said, 100 images do not sound like too much to me :S

Thanks anyway, I will test your advice and give you feedback if it succeeds!

laxn_pander ( 2016-07-27 00:58:54 -0600 )
