Java.lang.OutOfMemoryError: null while writing on BufferedOutputStream


I am trying to read data from a Snowflake ResultSet and construct a line by appending all the columns, as per my requirement. There are 1300 columns to read from the result set. I wrote the approach below.

StringBuilder row = new StringBuilder();
int available = -1;
while (rs.next()) {
    for (int i = 1; i <= 1300; i++) {
        String value = rs.getString(i);
        if (value == null)
            value = "";
        row.append(value).append("\t");
    }
    // build the full line for this record and write it out
    byte[] line = (row.toString().trim() + "\n").getBytes(StandardCharsets.UTF_8);
    available = line.length;
    bufferedOutputStream.write(line, 0, available);
    row.setLength(0); // reset the builder for the next record
}
bufferedOutputStream.flush();
bufferedOutputStream.close();

This works for up to 100K records but fails when I try to write 1 million records, throwing:

Java.lang.OutOfMemoryError: null
at java.base/java.io.ByteArrayOutputStream.hugeCapacity(ByteArrayOutputStream.java:125)
at java.base/java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:119)
at java.base/java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:95)
at java.base/java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:156)
at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:81)
at java.base/java.io.BufferedOutputStream.write(BufferedOutputStream.java:127)

Please correct me if I am wrong above. Basically I need to write 10 million records without running into memory issues. I need some help here; thanks in advance.

Edit: I am writing this into a ByteArrayOutputStream so that I can read it back through a ByteArrayInputStream as an InputStream. A validation process then reads the data from that input stream.
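
Roughly, the flow looks like this (the validate call is just a placeholder name for the downstream step, not my actual code):

ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(byteArrayOutputStream);

// ... the write loop shown above fills bufferedOutputStream ...

// toByteArray() copies the accumulated content, so the whole data set sits in memory
InputStream inputStream = new ByteArrayInputStream(byteArrayOutputStream.toByteArray());
validate(inputStream); // placeholder for the validation step that reads the InputStream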

CodePudding user response:

Remove the row StringBuilder and write each field directly to the stream:

try {
    while (rs.next()) {
        for (int i = 1; i <= 1300; i++) {
            String value = rs.getString(i);
            if (value != null && !value.isEmpty()) {
                byte[] b = value.getBytes(StandardCharsets.UTF_8);
                bufferedOutputStream.write(b);
            }
            bufferedOutputStream.write('\t');
        }
        bufferedOutputStream.write('\n');
    }
} finally {
    bufferedOutputStream.close();
}

It could be that you have CLOB fields with huge texts. In that case rs.getString could be replaced by a reader (rs.getCharacterStream).
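
Something along these lines, streaming the column in chunks instead of materializing it as one String (the column index and buffer size are only illustrative; in real code the writer would be created once, outside the loop):

// Bridge chars to bytes on the same underlying stream
Writer writer = new OutputStreamWriter(bufferedOutputStream, StandardCharsets.UTF_8);
try (Reader reader = rs.getCharacterStream(1)) { // column 1 is just an example
    if (reader != null) {
        char[] chunk = new char[8192];
        int n;
        while ((n = reader.read(chunk)) != -1) {
            writer.write(chunk, 0, n); // copy in 8 KB chunks, never the whole value
        }
    }
}
writer.flush(); // push buffered chars down before writing raw bytes again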

CodePudding user response:

For this I think you can try amending the JVM options to give more memory:

-Xms # minimum size of the JVM heap

-Xmx # maximum size of the JVM heap

For example:

class Obj {
    int i;
    byte j;
    String s;
}

empty object header (8) + int (4) + byte (1) + String reference (4) = 17 bytes; object size is padded to a multiple of 8, so it's 24 bytes.

An object holding 1300 columns will be a big Java object; suppose it takes 500 bytes per record, then

(500 * 1000000)/1024/1024 ≈ 480M

so maybe you can try to set:

-Xms512m # minimum size of the JVM heap

-Xmx640m # maximum size of the JVM heap
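
You can verify what the JVM actually got with something like:

// Quick sanity check that the -Xmx setting took effect (maxMemory() returns bytes)
long maxHeap = Runtime.getRuntime().maxMemory();
System.out.println("max heap = " + (maxHeap / 1024 / 1024) + " MB");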
