CodePudding user response:
On a single machine, the only way to process data this large is block by block. Before running, you can estimate the space complexity yourself: 1045882404 bytes ≈ 997 MB, which means your program tried to allocate that much memory in a single request and the system could not provide it. You have two options:

1. Run the program on a machine with more memory.
2. Modify the program to process the data in blocks, similar to tile processing. How to partition depends on the algorithm, but the vast majority of algorithms can be partitioned and do not require all of the data to be resident in memory at the same time; how well this works depends on the author's implementation skill.

Another possibility is that the memory is exhausted not by the data itself but by intermediate variables that consume large amounts of memory; those also need to be dealt with.

In short: if you are using off-the-shelf software, add memory to the machine; if you wrote the program yourself, change the algorithm to use less memory. It is the classic trade-off between the two resources, space and time.
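As a minimal sketch of the block-processing idea, the snippet below sums the bytes of a large file in fixed-size chunks, so only one chunk ever resides in memory instead of the whole file. The function name, file path, and chunk size are illustrative choices, not anything from the original question:

```python
# Sketch: stream a large file in fixed-size blocks so the whole
# file never has to fit in memory at once. The same pattern applies
# to any algorithm that can accumulate a result block by block.

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MB per block (tunable)

def chunked_byte_sum(path, chunk_size=CHUNK_SIZE):
    """Sum all byte values in the file, holding at most one chunk in RAM."""
    total = 0
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk_size)  # at most chunk_size bytes live in memory
            if not block:               # empty read means end of file
                break
            total += sum(block)         # fold this block into the running result
    return total
```

The peak memory use is bounded by `chunk_size` regardless of the file size, trading a little extra time (many small reads) for a large saving in space.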