What are good ways to handle big data in memory in C#?

Time:01-13

Application:
There are about 100 million records, and each analysis pass processes all of them from start to end. The current approach is to load the data into C# memory once; subsequent analyses reuse it without reloading, and one full pass takes about three minutes.
Special requirement:
The data has a time sequence and must be processed in time order.

The data grows incrementally every day, so I worry that it will eventually be too large to fit in memory.

I'd like to ask: what good approaches are there? For example, should I use Redis, or a distributed setup?


Goal:
A better architecture, without losing performance.

What I tried:
1. Loading all the data into Redis first (this part is not timed); but reading it all back from Redis into the application for each analysis takes too long, so this is not viable.
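For comparison, if the records can be kept in a local binary file in time order, a single sequential streaming pass keeps memory usage flat regardless of record count and avoids the Redis round-trip entirely. A minimal sketch, assuming a made-up fixed record layout (8-byte timestamp ticks plus an 8-byte double value, names and layout are illustrative, not the poster's actual format):

```csharp
using System;
using System.IO;

// Assumed record layout: 8-byte timestamp (ticks) + 8-byte double value,
// written to the file already in time order.
string path = Path.Combine(Path.GetTempPath(), "records.bin");

// Write a few sample records in time order (stand-in for the real data file).
using (var w = new BinaryWriter(File.Create(path)))
{
    foreach (var (t, v) in new[] { (1L, 10.0), (2L, 20.0), (3L, 30.0) })
    {
        w.Write(t);
        w.Write(v);
    }
}

// One sequential pass: because records are stored in time order, a plain
// sequential read also satisfies the "process in time sequence" requirement.
long count = 0;
double sum = 0;
using (var r = new BinaryReader(new FileStream(
        path, FileMode.Open, FileAccess.Read, FileShare.Read,
        bufferSize: 1 << 20, FileOptions.SequentialScan)))
{
    while (r.BaseStream.Position < r.BaseStream.Length)
    {
        long ticks = r.ReadInt64();    // timestamp (unused in this toy aggregate)
        double value = r.ReadDouble();
        count++;
        sum += value;
    }
}
Console.WriteLine($"{count} {sum}");
```

`FileOptions.SequentialScan` hints the OS to read ahead aggressively, which suits a start-to-end scan; whether a disk scan beats the current in-memory pass depends on the analysis itself and on I/O speed.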

I don't have many forum points to offer.

CodePudding user response:

The data volume is very large; revise the algorithm instead of keeping the current full-pass processing.
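One common reading of "revise the algorithm" is incremental processing: if the analysis can be expressed as a fold over time-ordered records, you can persist yesterday's running state and absorb only each day's new records, instead of rescanning all 100 million. A minimal sketch; the state shape and names are illustrative, and this only works when the analysis is expressible as such a fold:

```csharp
using System;
using System.Collections.Generic;

// Day 1: fold the first batch into a fresh state.
var state = new RunningState();
state.Absorb(new[] { 10.0, 20.0 });

// Day 2: fold in ONLY the new records; the historical ones are never re-read.
state.Absorb(new[] { 30.0 });

Console.WriteLine($"{state.Count} {state.Mean}");

// Illustrative running state for a simple count/mean analysis. In practice the
// state would be serialized to disk between daily runs.
class RunningState
{
    public long Count;
    public double Sum;
    public double Mean => Count == 0 ? 0 : Sum / Count;

    // Fold a batch of new records (already in time order) into the state.
    public void Absorb(IEnumerable<double> newValues)
    {
        foreach (var v in newValues) { Count++; Sum += v; }
    }
}
```

This turns a three-minute full pass into work proportional to each day's increment; the trade-off is that analyses which genuinely need the whole history at once (e.g. order-dependent logic that revisits old records) cannot be folded this way.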
Tags: C#