Why is Spark so inefficient at processing small data?

Time:09-22

I am using Spark to process a dataset of a few megabytes, about 10,000 records. Each simulation traverses all 10,000 records, and I need to run 100,000 simulations. With Spark the whole run takes about 2 hours, but a single-threaded program finishes the same 100,000 simulations in about 2 minutes. Is Spark simply unsuitable for this kind of algorithm?
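A minimal single-threaded sketch of the workload described above, scaled down for illustration. The record values and the per-record computation are hypothetical stand-ins; the question does not say what each simulation actually computes:

```python
import random

def run_simulations(records, n_simulations):
    """Run n_simulations passes; each pass traverses every record once."""
    results = []
    for _ in range(n_simulations):
        # Hypothetical per-record work: sum a noisy copy of each value.
        total = sum(v + random.random() for v in records)
        results.append(total)
    return results

records = [float(i) for i in range(10_000)]  # ~10,000 records, a few MB
results = run_simulations(records, 100)      # scaled down from 100,000
print(len(results))
```

At this scale each pass is a tight in-memory loop over a list that fits easily in CPU cache, which is why the single-threaded version is fast.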

CodePudding user response:

Spark is better suited to iterative computation over large datasets.

CodePudding user response:

Spark is better suited to large-scale data. With only a few megabytes, per-job scheduling, task serialization, and network overhead dominate the actual computation.
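A back-of-envelope check using the figures from the question supports this: 100,000 simulations in 2 hours is roughly 72 ms per simulation under Spark, versus about 1.2 ms single-threaded, so almost all of the Spark runtime is overhead rather than computation:

```python
spark_total_s = 2 * 3600   # 2 hours reported with Spark
single_total_s = 2 * 60    # 2 minutes reported single-threaded
n_sims = 100_000

spark_per_sim_ms = spark_total_s / n_sims * 1000    # per-simulation cost under Spark
single_per_sim_ms = single_total_s / n_sims * 1000  # per-simulation cost single-threaded
overhead_ms = spark_per_sim_ms - single_per_sim_ms  # overhead Spark adds per simulation
print(spark_per_sim_ms, single_per_sim_ms, overhead_ms)
```

About 70 ms of fixed overhead per simulation is consistent with launching many tiny Spark jobs. A common remedy (not shown in the thread) is to parallelize over the simulations rather than the records, broadcasting the small dataset to executors so each task runs a large batch of simulations locally.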