I am importing JSON data from the public database at http://data.seattle.gov/api/views/3k2p-39jp/rows.json, which contains roughly 445,454 rows. With the following code I am building a JSON object of the entire dataset:
HttpGet get = new HttpGet(uri);
HttpClient client = new DefaultHttpClient();
HttpResponse response = client.execute(get);

// Read the entire response body into a single string
BufferedReader reader = new BufferedReader(
        new InputStreamReader(response.getEntity().getContent(), "UTF-8"));
StringBuilder builder = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
    builder.append(line).append("\n");
}

// Parse the accumulated string and pull out the rows
JSONTokener jsonTokener = new JSONTokener(builder.toString());
JSONObject finalJson = new JSONObject(jsonTokener);
JSONArray data = finalJson.getJSONArray("data");
Because the data is so large, I am getting

03-21 03:41:49.714: E/AndroidRuntime(666): Caused by: java.lang.OutOfMemoryError

with the error pointing at builder.append(line).append("\n"). Is there any way I can handle a dataset this large without running into memory allocation issues?
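One idea I had was to stream-parse the response instead of buffering all ~445k rows into a StringBuilder first. Below is a rough sketch using android.util.JsonReader (available since API 11), assuming the usual Socrata rows.json layout: a top-level object with "meta" and "data" keys, where "data" is an array of row arrays. handleRow() is a hypothetical placeholder for whatever per-row work is needed (e.g. batching inserts into SQLite), so only one row lives in memory at a time. Would something like this be the right direction?

import android.util.JsonReader;
import android.util.JsonToken;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;

void streamRows(String uri) throws IOException {
    HttpGet get = new HttpGet(uri);
    HttpClient client = new DefaultHttpClient();
    HttpResponse response = client.execute(get);

    // Parse straight off the stream; nothing accumulates except the current row
    JsonReader reader = new JsonReader(
            new InputStreamReader(response.getEntity().getContent(), "UTF-8"));
    try {
        reader.beginObject();                          // { "meta": ..., "data": [...] }
        while (reader.hasNext()) {
            if ("data".equals(reader.nextName())) {
                reader.beginArray();                   // the ~445k rows
                while (reader.hasNext()) {
                    reader.beginArray();               // each row is itself an array
                    List<String> row = new ArrayList<String>();
                    while (reader.hasNext()) {
                        JsonToken next = reader.peek();
                        if (next == JsonToken.STRING || next == JsonToken.NUMBER) {
                            row.add(reader.nextString());
                        } else {
                            reader.skipValue();        // nulls, booleans, nested values
                            row.add(null);
                        }
                    }
                    reader.endArray();
                    handleRow(row);                    // hypothetical: persist the row, then drop it
                }
                reader.endArray();
            } else {
                reader.skipValue();                    // skip "meta" and anything else
            }
        }
        reader.endObject();
    } finally {
        reader.close();
    }
}

// Hypothetical sink for one parsed row, e.g. batched inserts into SQLite
void handleRow(List<String> row) {
    // ...
}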