I've got a problem which requires me to keep track of a large amount of data in a hard-to-predict way. I do, however, know a few rules that allow me to determine when some data is no longer needed.
I was thinking of using a containers.Map to store the data as it is produced (since I cannot preallocate an empty matrix large enough to hold all of it). I would then run my code and, at each step, if the conditions are met, clear some of the stored data using remove(M, keys).
However, I've run a very basic test and the memory allocated to the Map doesn't seem to change when I remove elements. Here is the test I ran:
testmap = containers.Map([1,2,3],[11,22,33])
whos testmap % outputs a size of 3x1 as expected and an allocation of 8 Bytes
remove(testmap,[1]);
whos testmap % outputs a size of 2x1 as expected ... but still an allocation of 8 Bytes
Am I looking at the right thing here? Is there another way to proceed to have a data container which can be dynamically adjusted in terms of memory usage?
Thanks.
CodePudding user response:
Note that even an empty map reports 8 bytes, so your test isn't telling you anything:
testmap = containers.Map();
whos testmap
% Name Size Bytes Class Attributes
% testmap 0x1 8 containers.Map
The container object itself takes up very little memory; `whos` only sees the handle, not the stored data. To measure the size of the associated map data, we can take a lead from this question and convert the map to a struct, whose bytes `whos` does count:
testmap = containers.Map(1:10,1:10);
testmap_s = struct(testmap);
whos testmap_s
% Name Size Bytes Class Attributes
% testmap_s 1x1 3809 struct
remove(testmap,{1,2,3,4,5,6,7,8});
testmap_s = struct(testmap);
whos testmap_s
% Name Size Bytes Class Attributes
% testmap_s 1x1 2017 struct
So yes, this should work for your application: the data is cleared from the map and, pending any additional memory management by MATLAB, should be freed in due course.
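If you want to monitor this as your code runs, the struct trick above can be wrapped in a small helper. This is just a sketch; `mapBytes` is a hypothetical name, and it relies on the (undocumented) fact that `struct(M)` exposes the map's internal data so that `whos` can count it:

```matlab
function b = mapBytes(M)
% mapBytes  Estimate the memory used by a containers.Map's stored data.
%   Converts the map to a struct (which copies its internal data into a
%   plain variable) and reads the Bytes field reported by whos.
    s = struct(M); %#ok<NASGU>  % may issue a warning; the conversion still works
    w = whos('s');
    b = w.Bytes;
end
```

A usage sketch, mirroring the test above:

```matlab
M = containers.Map(1:10, 1:10);
before = mapBytes(M);
remove(M, num2cell(1:8));
after = mapBytes(M);   % expect after < before
```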