An efficient way to process data and save it in a database in Java

Time: 11-19

Is there a more efficient way of processing data when a file's size is in the range of 100 KB-800 KB?

The only approach I know is:

  • Get data from a file and store it in a Java object.
  • Process the data.
  • Save the data to the database [insert, update] using for loops.

**In every loop iteration, the project closes the database connection and reconnects at the start of the next iteration.

CodePudding user response:

I think your biggest concern should be reading and processing the data in memory when the file is big. If you read the whole file at once and then process that data, you will need a lot of memory.

For reading, you should process the file as a stream, so the data is handled piece by piece, and only then write the results to the database.
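As a minimal sketch of the streaming approach, assuming a plain-text input file (the class name and the filter are illustrative, not from the original answer), Java's `Files.lines` lets you process one line at a time without loading the whole file:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class StreamRead {
    // Count non-blank lines by streaming the file; only one line is held
    // in memory at a time, regardless of how large the file is.
    static long countNonEmpty(Path file) throws IOException {
        try (Stream<String> lines = Files.lines(file)) {
            return lines.filter(l -> !l.isBlank()).count();
        }
    }
}
```

The same pattern works for real processing: replace the `filter`/`count` with whatever per-line parsing and transformation you need, and the memory footprint stays flat for a 100 KB file or a 100 MB one.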

You must not close the database connection in each loop iteration. Creating a database connection is costly, so reuse the existing connection and close it only when the application exits.
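A sketch of that connection lifecycle with plain JDBC; the JDBC URL, the `items` table, and the `name` column are placeholders, and the key point is that the connection and prepared statement are opened once, reused for every row, and closed only after the loop:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class SingleConnectionInsert {
    // Open ONE connection, reuse ONE PreparedStatement for every row,
    // and let try-with-resources close both only after all rows are done.
    static void insertAll(String jdbcUrl, List<String> names) throws SQLException {
        String sql = "INSERT INTO items(name) VALUES (?)"; // hypothetical table
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            for (String name : names) {
                ps.setString(1, name);
                ps.executeUpdate(); // no reconnect per row
            }
        }
    }
}
```

In a long-running application you would typically go one step further and use a connection pool, so "close" just returns the connection for reuse.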

CodePudding user response:

You need to stream the data in from the file, collect it into batches, and insert each batch into the database when it gets full.

Let's say the input is XML (which gets very big): FileInputStream -> XmlPullParser -> create object -> add to list -> when the list reaches a size threshold, do a batch insert.

This truly depends on how big your data is per item, but for instance, suppose you have an object with 12 fields, i.e. 12 columns, that you need to insert. I found I could do about 500 items per batch per gigabyte of total RAM. This scaled quite well for me, from a device with 1 GB running Android 4.4.2 to a device with 4 GB running Android 10.
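The accumulate-and-flush pattern described above can be sketched as below. The `items` table, the `BATCH_SIZE` of 500 (following the answer's per-gigabyte heuristic), and the `String` item type are all placeholders; passing `null` for the connection is only there so the batching logic can be dry-run without a database:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class BatchInserter {
    static final int BATCH_SIZE = 500; // tune per available RAM, per the heuristic above

    private final List<String> buffer = new ArrayList<>();
    private final Connection conn;    // null = dry run, skip the actual insert
    int flushes = 0;                  // how many batches have been sent

    BatchInserter(Connection conn) { this.conn = conn; }

    // Add one parsed item; flush automatically when the buffer is full.
    void add(String name) throws SQLException {
        buffer.add(name);
        if (buffer.size() >= BATCH_SIZE) flush();
    }

    // Send the current buffer as one JDBC batch (one round trip), then clear it.
    void flush() throws SQLException {
        if (buffer.isEmpty()) return;
        if (conn != null) {
            String sql = "INSERT INTO items(name) VALUES (?)"; // hypothetical table
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (String n : buffer) {
                    ps.setString(1, n);
                    ps.addBatch();
                }
                ps.executeBatch();
            }
        }
        flushes++;
        buffer.clear();
    }
}
```

The parser loop then just calls `add(...)` per item and one final `flush()` at the end, so memory usage is bounded by the batch size rather than by the file size.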

As a benchmark, for a 23 MB XML file with 80,000 items:

It takes ~3 minutes to insert on the Android 4.4.2 device.

It takes ~20 seconds to insert on the Android 10 device.
