Writing JSON data to text file and creating new file dynamically when size limit is reached for first file


I have a Spring Boot application which is constantly writing JSON data to a text file in a given location. Now, I also want to create new text files dynamically once the size limit is reached.

I was thinking of handling the above manually, with code something like this:

    for (int i = 1; i < 1000; i++) {
        try {
            File f = new File(FILE_LOCATION + fileName);
            // Files.size() throws if the file does not exist yet
            fileSize = f.exists() ? Files.size(f.toPath()) : 0;
            System.out.println("filesize: " + fileSize);

            if (f.exists()) {
                if (fileSize > SIZE_1KB) {
                    // size limit reached: roll over to a new file
                    writer = new FileWriter(FILE_LOCATION + "00" + i + ".txt");
                    fileName = "00" + i + ".txt";
                } else {
                    // keep appending to the current file
                    writer = new FileWriter(FILE_LOCATION + fileName, true);
                }
            } else {
                writer = new FileWriter(FILE_LOCATION + fileName);
            }

            // use writer to write data

            if (f.exists()) {
                for (int j = 0; j < 10; j++)
                    writer.append(UUID.randomUUID().toString());
            } else {
                for (int j = 0; j < 10; j++)
                    writer.write(UUID.randomUUID().toString());
            }

            writer.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

This is just a piece of code I created to post the question here, but the logic of writing data to the file is similar to what I'm using in the Spring Boot application.

Also, if I handle things manually, I'll have to take care of a lot of things myself apart from creating new files when the size limit is reached: moving files to an archive folder when the date changes, deleting older files from the archive folder when its size limit is reached, etc.

I've also looked into logback.xml configuration, but that's already being used for rolling the log files, so I don't think I can use it here.

At this point, I feel like there might be a better way to do all this instead of handling it manually. If anyone can suggest a library or framework, or anything really, it would be a great help.

Edit: Okay, I've come across rotating-fos. I'm trying to determine an appropriate configuration that meets my general requirements: rotating based on size and date, and deleting old files when a size limit is reached.

Edit 2: I've now used the rotating-fos library to achieve all of the things mentioned above except deletion of files, since in my use case the text files are pushed to a data pipeline by a separate procedure, which will take care of removing them.

private void processEntry(Map<String, Object> entry) {
    try {
        String path = jsonProcessingPath + "/" + CALL_DATA + "/";
        SimpleDateFormat format = new SimpleDateFormat(DATE_FORMAT);
        File dir = new File(path + format.format(new Date()));

        if (!dir.exists())
            dir.mkdirs();

        String fileName = CALL_DATA_FILE_NAME;
        File file = new File(dir.getAbsolutePath() + "/" + fileName);
        if (!file.exists())
            file.createNewFile();

        // rotate when the size limit is reached, and daily at midnight
        RotationConfig config = RotationConfig.builder().file(file.getAbsolutePath())
                .filePattern(dir + "/" + CALL_DATA_FILE_NAME + ".%d{HHmmss}.txt")
                .policy(new SizeBasedRotationPolicy(Long.parseLong(SIZE_LIMIT)))
                .policy(DailyRotationPolicy.getInstance()).build();
        RotatingFileOutputStream outputStream = new RotatingFileOutputStream(config);

        Gson gson = new Gson();
        String json = gson.toJson(entry);
        int currentId = gson.fromJson(json, JsonObject.class).get(ID).getAsInt();

        log.debug("Writing entry of report_data_calls table");
        // write the JSON as UTF-8 bytes rather than the platform default
        outputStream.write(json.getBytes(StandardCharsets.UTF_8));
        outputStream.close();

        exampleSchedulerService.updateJobDataMapInfo(ProcessCallDataJob.class, CALL_DATA_LAST_ID, currentId);
        log.debug("Updated last processed call id to be: " + currentId);
    } catch (Exception e) {
        log.error("error", e);
    }
}
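
Side note: in the snippet above the RotatingFileOutputStream is built and closed for every entry. For a long-running writer it is probably better to build it once and reuse it, so rotating-fos can apply the policies across writes. A minimal sketch, assuming the same RotationConfig as above (writeEntry is a hypothetical helper):

// built once (e.g. at startup) and reused for every entry
private RotatingFileOutputStream rotatingStream = new RotatingFileOutputStream(config);

private void writeEntry(String json) throws IOException {
    // size-based rotation is evaluated as bytes are written
    rotatingStream.write(json.getBytes(StandardCharsets.UTF_8));
    rotatingStream.flush();
}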

CodePudding user response:

I think rotating-fos could be a good solution.
In any case, if you want to do it manually, you can improve your code by combining two classes from the java.util.logging package: Logger and FileHandler.

I wrote a little PoC; play around with it if you think it can be helpful.

import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Formatter;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public static void main(String[] args) throws IOException {
    // pattern, limit (max bytes per file), count (number of files), append
    FileHandler fileHandler = new FileHandler("the_log.json", 100, 100, true);
    fileHandler.setFormatter(new Formatter() {
      @Override
      public String format(LogRecord record) {
        // here you could pretty print the content or do elaborations on it...
        return record.getMessage();
      }
    });

    Logger jsonLogger = Logger.getLogger("MyJsonLogger");
    jsonLogger.setLevel(Level.ALL);
    jsonLogger.addHandler(fileHandler);

    for (int i = 0; i < 999; i++) {
      jsonLogger.log(Level.ALL, i + "\n");
    }
}

Please note that the FileHandler input parameters are a little bit tricky; here are the docs.
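
In short, the constructor used above is FileHandler(String pattern, int limit, int count, boolean append): limit is the approximate maximum number of bytes per file (0 means no limit), count is the number of files to cycle through, and the pattern may contain %g for the generation number (if count > 1 and there is no %g, a generation number is appended to the file name after a dot). For example (the 1 MB / 5 files values are just illustrative):

// keeps up to 5 files of roughly 1 MB each: the_log_0.json ... the_log_4.json
FileHandler fileHandler = new FileHandler("the_log_%g.json", 1_000_000, 5, true);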

CodePudding user response:

JSON files should be UTF-8 encoded. FileWriter is a convenience class that uses the default operating-system encoding, so it is non-portable. For UUIDs (which are ASCII) it would work, but it is a ticking time bomb.
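
A safe alternative is to set the charset explicitly. A minimal sketch using java.nio, reusing FILE_LOCATION and fileName from the question:

// explicit UTF-8 instead of the platform default charset
Writer writer = Files.newBufferedWriter(Paths.get(FILE_LOCATION + fileName),
        StandardCharsets.UTF_8,
        StandardOpenOption.CREATE, StandardOpenOption.APPEND);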

Also, just appending UUIDs leaves out any separator, such as a line break:

writer.append(UUID.randomUUID().toString()).append("\r\n");

You can use a logger for rolling file appenders. The handler can be tied to your own class, so there is no interference with the existing logging.
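
For example, with logback you can declare a separate rolling appender and attach it to one dedicated logger with additivity="false", so it does not mix with the normal application logs. A sketch; the logger name, paths, and limits below are just placeholders:

<appender name="JSON_DATA" class="ch.qos.logback.core.rolling.RollingFileAppender">
  <file>/data/json/current.json</file>
  <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
    <fileNamePattern>/data/json/archived/%d{yyyy-MM-dd}.%i.json</fileNamePattern>
    <maxFileSize>1MB</maxFileSize>
    <maxHistory>7</maxHistory>
    <totalSizeCap>100MB</totalSizeCap>
  </rollingPolicy>
  <encoder>
    <pattern>%msg%n</pattern>
  </encoder>
</appender>

<logger name="jsonDataLogger" level="INFO" additivity="false">
  <appender-ref ref="JSON_DATA" />
</logger>

In code you would then write through LoggerFactory.getLogger("jsonDataLogger").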
