I'm receiving the following OutOfMemoryError while trying to page through responses from an API:
Caused by: java.lang.OutOfMemoryError: Failed to allocate a 37748744 byte allocation with 25165824 free bytes and 24MB until OOM, target footprint 268261936, growth limit 268435456
    at java.util.Arrays.copyOf(Arrays.java:3257)
    at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
    at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:649)
    at java.lang.StringBuilder.append(StringBuilder.java:203)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.readAll(INatCall.java:105)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.restCalliNat(INatCall.java:56)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.restCalliNat(INatCall.java:75)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.restCalliNat(INatCall.java:75)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.restCalliNat(INatCall.java:75)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.doInBackground(INatCall.java:42)
Is there a more effective way of handling a large set of data through REST? My app calls an API that returns a user's 'entries'. The API's maximum page size is 200 entries. Most users will have more than 200 entries, and this has been fine for the most part; my app can page and account for that. However, some users have 2000 entries, and my app is running out of memory trying to iterate through these larger sets. My questions are:
- Is there a way to increase the amount of memory my android app can use?
- What are some ways that I can optimize the below code to use less memory?
Recursive REST call
private JSONArray restCall(URL url, JSONArray results) {
    Log.d(TAG, "restCalliNat: Start");
    InputStream is = null;
    int totalResults = 0;
    int perPage = 0;
    JSONArray newResults = new JSONArray();

    try {
        is = url.openStream();
        BufferedReader rd = new BufferedReader(new InputStreamReader(is, Charset.forName("UTF-8")));
        String jsonText = readAll(rd); //<-- LINE 56
        is.close();
        rd.close();

        JSONObject json = new JSONObject(jsonText);
        newResults = json.getJSONArray("results");
        totalResults = (int) json.get("total_results");
        perPage = (int) json.get("per_page");
    } catch (IOException | JSONException e) {
        e.printStackTrace();
        return null;
    }

    if (results != null) {
        newResults = concatArray(results, newResults);
    }

    if (totalResults > page * perPage) {
        newResults = restCall(updatePageCount(url), newResults);
    }

    return newResults;
}
At each new page, the new results get concatenated onto the accumulated ones until I have all the entries.
private JSONArray concatArray(JSONArray arr1, JSONArray arr2) {
    JSONArray result = new JSONArray();
    try {
        for (int i = 0; i < arr1.length(); i++) {
            result.put(arr1.get(i));
        }
        for (int i = 0; i < arr2.length(); i++) {
            result.put(arr2.get(i));
        }
    } catch (JSONException e) {
        e.printStackTrace();
    }
    return result;
}
Converts API response to String
private static String readAll(Reader rd) throws IOException {
    StringBuilder sb = new StringBuilder();
    int cp;
    while ((cp = rd.read()) != -1) {
        sb.append((char) cp); //<--- LINE 105
    }
    return sb.toString();
}
CodePudding user response:
and my app is running out of memory
That is not exactly what the error says: Caused by: java.lang.OutOfMemoryError: Failed to allocate a 37748744 byte allocation with 25165824 free bytes and 24MB until OOM. You have 24MB of free memory, but you are trying to allocate a single ~36MB buffer, and that is not going to work. Even if you had 124MB or 524MB of free memory, you might not have access to a single 36MB contiguous block of free memory.
Converts API response to String
Either:
- Request far fewer items from your REST Web service at a time (instead of 200, try 5); or
- Stop trying to read the entire API response into a string, as your API response is very large
You might want to try more modern solutions for accessing a REST-style Web service. For example, Retrofit is rather popular and would not require you to read the entire raw JSON into a single string. Even if you wanted to stick with the openStream()-on-a-URL approach, modern JSON parsers (Moshi, Gson, or even the JsonReader in the Android SDK) are streaming parsers and would not require you to read the entire raw JSON into a single string.
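For illustration, here is a minimal sketch of that streaming approach with the SDK's JsonReader, still using openStream() on a URL. The "results" array and the "id" field are assumptions about the response shape taken from the question's code, and the Log.d() call stands in for whatever the app actually does with each entry:

import android.util.JsonReader;
import android.util.Log;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.Charset;

// Streams one page of results. Only one entry's worth of JSON is parsed
// at a time, so the heap never has to hold the whole response as a String.
private void streamPage(URL url) throws IOException {
    try (InputStream is = url.openStream();
         JsonReader reader = new JsonReader(
                 new InputStreamReader(is, Charset.forName("UTF-8")))) {
        reader.beginObject();
        while (reader.hasNext()) {
            String name = reader.nextName();
            if (name.equals("results")) {
                reader.beginArray();
                while (reader.hasNext()) {
                    readEntry(reader);
                }
                reader.endArray();
            } else {
                reader.skipValue(); // total_results, per_page, etc., if not needed
            }
        }
        reader.endObject();
    }
}

// Reads a single entry object, keeping only the field(s) the app needs.
// The "id" field is a placeholder for whatever the real entries contain.
private void readEntry(JsonReader reader) throws IOException {
    long id = -1;
    reader.beginObject();
    while (reader.hasNext()) {
        if (reader.nextName().equals("id")) {
            id = reader.nextLong();
        } else {
            reader.skipValue();
        }
    }
    reader.endObject();
    Log.d(TAG, "entry id: " + id); // replace with the app's per-entry handling
}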
CodePudding user response:
It turns out the API I was calling has a limit on how many results you can request overall.
The per_page limit is 200 results per page, but there is also an overall cap: any request where page * per_page exceeds 1000 gets rejected. So it didn't matter what the per_page count was; if page * per_page > 1000, the call would be cancelled, even in the middle of a stream.
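If you need to keep the recursion from ever crossing that cap, a minimal sketch of a guard (assuming the 1000-result limit described above; other endpoints may use a different figure) would be to change the recursion condition in restCall():

private static final int MAX_TOTAL_RESULTS = 1000; // cap reported by this particular API

// Only fetch the next page while it would still fall inside the API's overall limit.
if (totalResults > page * perPage && page * perPage < MAX_TOTAL_RESULTS) {
    newResults = restCall(updatePageCount(url), newResults);
}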
CodePudding user response:
Add android:hardwareAccelerated="false" and android:largeHeap="true" to your manifest; it works in some situations:
<application
    android:allowBackup="true"
    android:hardwareAccelerated="false"
    android:largeHeap="true"
    android:icon="@mipmap/ic_launcher"
    android:label="@string/app_name"
    android:supportsRtl="true"
    android:theme="@style/AppTheme">