
I know this has been asked before, but since I haven't been able to find an answer with a definitive conclusion, or at least one that shows the pros and cons of the possible approaches, I have to ask:

When it comes to reading data from the Internet, from a web service for instance, what is the correct or most efficient way to read it?

From all the books I have glanced over, I've found at least 4 ways to read data:

1) Reading a specific number of characters at a time. In this case the data is read in chunks of 4096 characters:

BufferedReader reader = new BufferedReader(
        new InputStreamReader(in, encoding));
char[] buffer = new char[4096];
StringBuilder sb = new StringBuilder();
int len1 = 0;
while ((len1 = reader.read(buffer)) > 0) {
    sb.append(buffer);
}
return sb.toString();

2) Reading the data when the content length is known:

int length = ((HttpURLConnection) urlConnection).getContentLength();
InputStream inputStream = urlConnection.getInputStream();
BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream));
StringBuilder stringBuilder = new StringBuilder(length);
char[] buffer = new char[length];
int charsRead;
while ((charsRead = bufferedReader.read(buffer)) != -1) {
    stringBuilder.append(buffer, 0, charsRead);
}
return stringBuilder.toString();

3) Reading the data line by line:

BufferedReader reader = new BufferedReader(new InputStreamReader(c.getInputStream()));
StringBuilder buf = new StringBuilder();
String line = null;
// note: readLine() strips the line terminators, so they are not appended to buf
while ((line = reader.readLine()) != null) {
    buf.append(line);
}
return buf.toString();

4) Reading the data character by character:

InputStream in = mConnection.getInputStream();
BufferedReader reader = new BufferedReader(new InputStreamReader(
        in, encoding));
int ch;
StringBuilder sb = new StringBuilder();
// read() returns -1 at end of stream
while ((ch = reader.read()) != -1) {
    sb.append((char) ch);
}

return sb.toString().trim();

I have tried three of these four techniques (all except number 3, reading the data line by line), and only the fourth has given me good results.

  • The first method didn't work for me: when I read large amounts of data, it often cut the data off, leaving me with invalid JSON strings or strings with whitespace at the end.
  • The second approach I wasn't able to use, because getContentLength is not always reliable; if the value is not set, there's nothing we can do about it, and that was my case.
  • I didn't try the third method because I wasn't sure about reading data "line" by "line". Does this apply to data that contains an array of JSON objects, or only to files that actually contain lines?
  • The last technique being the only choice I was left with, I tried it and it worked, BUT I don't think that reading a large amount of data character by character would be efficient at all.

So now I would really appreciate your opinions and ideas. What approach do you use when it comes to reading data from web services, and more importantly, why?

Thanks.

P.S. I know I could've easily used DefaultHttpClient, but the documentation clearly discourages it:

"For Android 2.3 (Gingerbread) and later, HttpURLConnection is the best choice. Its simple API and small size makes it a great fit for Android. Transparent compression and response caching reduce network use, improve speed and save battery."

eddy

3 Answers


I've tried all the methods that you have mentioned. One problem I faced was the reply not being read completely. After some research, the most efficient/fastest way I found was to go about it like this:

DefaultHttpClient client = new DefaultHttpClient();
HttpGet httpGet = new HttpGet(url);

// I've put the JSON headers because I'm using JSON
httpGet.setHeader("Accept", "application/json");
httpGet.setHeader("Content-type", "application/json");

try {
    HttpResponse execute = client.execute(httpGet);
    String responseStr = EntityUtils.toString(execute.getEntity());
} catch (IOException e) {
    // handle the network error
    e.printStackTrace();
}

responseStr will contain the web service reply, and it reads it in one go. Hope this helps.
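
For reference, a rough, untested sketch of the same one-shot read done with HttpURLConnection (the API the question prefers); the url variable and the UTF-8 charset are assumptions, not part of the original answer:

HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
conn.setRequestProperty("Accept", "application/json");
try {
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(conn.getInputStream(), "UTF-8"));
    StringBuilder sb = new StringBuilder();
    char[] buffer = new char[4096];
    int read;
    // read() returns -1 at end of stream; append only the chars actually read
    while ((read = reader.read(buffer)) != -1) {
        sb.append(buffer, 0, read);
    }
    String responseStr = sb.toString();
} finally {
    conn.disconnect();
}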

ZInj
    Thank you @ZInj for taking the time to answer. I know that DefaultHttpClient provides an easier way to read/write data, but the official doc says this : `For Android 2.3 (Gingerbread) and later, HttpURLConnection is the best choice. Its simple API and small size makes it great fit for Android. Transparent compression and response caching reduce network use, improve speed and save battery` http://goo.gl/4zyIki – eddy Sep 16 '14 at 11:22

If the data volume is not too big, it doesn't really matter what approach you use. If it is, then it makes sense to use buffering and read the data in chunks.

The 2nd approach is not too good, as you cannot always get the Content-Length.

Then, if your data is text/HTML/JSON you can use the 3rd approach, as you don't have to bother with the chunk size. Also, you can print the incoming data line by line to aid debugging.

If your data is a binary/base64 stream, like an image, you should use the 1st approach and read the data in 4K blocks (the commonly used size).
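
A rough sketch of what that 1st approach could look like for binary data; the urlConnection variable and the exact 4 KB buffer size are just assumptions:

InputStream in = urlConnection.getInputStream();
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] block = new byte[4096];
int read;
while ((read = in.read(block)) != -1) {
    out.write(block, 0, read);   // copy only the bytes actually read
}
byte[] body = out.toByteArray();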

UPDATE:

BTW, instead of the dreaded DefaultHttpClient I'm using AndroidHttpClient as a singleton, and it works smoothly :)
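
A minimal sketch of how such a singleton might be kept; the holder class name and user-agent string here are made up, not part of the original answer:

public final class HttpClientHolder {
    private static AndroidHttpClient sClient;

    public static synchronized AndroidHttpClient get() {
        if (sClient == null) {
            // "MyApp/1.0" is just a placeholder user-agent
            sClient = AndroidHttpClient.newInstance("MyApp/1.0");
        }
        return sClient;
    }
}

Remember that AndroidHttpClient should eventually be released with close() when the app no longer needs it.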

injecteer
  • So reading the response line by line is not limited to files that actually have a new line character? – eddy Sep 16 '14 at 11:54
  • "instead of the dreaded DefaultHttpClient I'm using the AndroidHttpClient" - it's better to use HttpUrlConnection – Alexander Zhak Sep 16 '14 at 11:57
  • they *should* have line breaks, otherwise you end up buffering the whole response body in memory. That's why line-by-line reading makes sense for human-readable files: they usually DO have line breaks – injecteer Sep 16 '14 at 11:58
  • So reading a response that contains two or three different arrays of JSON objects is not possible with readLine()? – eddy Sep 16 '14 at 12:22
  • it IS possible; the question is rather how effectively the stream can be buffered. If you know that the JSON is not formatted (no line breaks), it's better to use chunks of a definite size – injecteer Sep 16 '14 at 12:26
  • @injecteer I've just tested the third method, and though it worked, it read the whole response in one shot. For small amounts of data I think that's OK, but with large amounts... hmm, I'm not so sure :S – eddy Sep 16 '14 at 20:01
  • this is exactly what I wrote :) – injecteer Sep 16 '14 at 20:27

It matters. Best for performance is to read from the InputStream into a buffer of a reasonable size. This way you transfer a decent amount of data at a time, rather than repeating the same operation thousands of times. Do not always rely on the Content-Length header value; for gzipped content it may report an incorrect size.
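
Roughly, such a buffered read might look like this (the connection variable, the 8 KB buffer size and the UTF-8 charset are arbitrary assumptions, not part of the original answer):

BufferedReader reader = new BufferedReader(
        new InputStreamReader(connection.getInputStream(), "UTF-8"));
StringBuilder sb = new StringBuilder();
char[] buffer = new char[8192];
int read;
// loop until read() returns -1; never size the buffer from Content-Length
while ((read = reader.read(buffer)) != -1) {
    sb.append(buffer, 0, read);
}
String body = sb.toString();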

Alexander Zhak
  • I tried something like this `char[] buffer = new char[16384];` but I always got strange results with undesirable characters at the end of the string – eddy Sep 16 '14 at 12:00
  • if you use InputStreamReader's read() method, then you should check if it returns -1, which means there's nothing more to read – Alexander Zhak Sep 16 '14 at 12:14
  • I know and that's exactly what I was doing as shown in my question `reader.read(buffer)) > 0` – eddy Sep 16 '14 at 13:55