
I have an app with a version for iOS and a version for Android. The app sends an image (a JPEG in this case) from the gallery or camera to a server using a multipart upload.

The images start life identical in dimensions, resolution and file size, and when I run an ImageMagick compare they are identical.

When they reach the server, they look identical to the naked eye and have identical pixel counts and dimensions, both 320 x 483 at 72 dpi. However, they have different file sizes, despite each being loaded with no compression specified. When I run an ImageMagick compare they are obviously different.

These are the original images:

Image on iPhone

Image on Android

These are the images when uploaded:

Image uploaded from iPhone

Image uploaded from Android

imagemagick compare image

iOS Code (using AFNetworking)

UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
NSData *imageData = UIImageJPEGRepresentation(img, 1.0);
UIGraphicsEndImageContext(); ...
[requestSerializer setValue:@"multipart/form-data" forHTTPHeaderField:@"content-type"];

NSMutableURLRequest *request = [requestSerializer multipartFormRequestWithMethod:@"POST" URLString: ServerPath
                    parameters:sendDictionary constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {

    [formData appendPartWithFileData: imageData name:@"file" fileName:@"temp.jpeg" mimeType:@"image/jpeg"];

} error:nil];

Android Code (using loopj for Async request)

    RequestParams params = new RequestParams();

    //use bytearray
    BitmapFactory.Options BMoptions = new BitmapFactory.Options();
    BMoptions.inSampleSize = 1; // Example, there are also ways to calculate an optimal value.

    InputStream in;
    try {
        in = getContentResolver().openInputStream(Uri.fromFile(new File(Globals.getImagePath())));

        Bitmap bitmap = BitmapFactory.decodeStream(in, null, BMoptions);

        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        bitmap = scaleToActualAspectRatio2(bitmap);

        //
        bitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
        bitmapdata = baos.toByteArray();

        params.put("file", new ByteArrayInputStream(bitmapdata), "androidMobile.jpg");

    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }           

I presume that it is the pixel intensities that make the difference.

Can someone explain why the images arrive different? Do I have to save the Android image to a file first and upload it as a file?

bugman
  • You could always compare the bytes to see what has changed. – Barns May 18 '18 at 21:49
  • The resizing and upload of your iOS image seems to have added a white border. This may be due to the resampling during the resizing and then rounding or truncating to the nearest whole pixel. The last pixel in width when resizing according to the heights is probably not a full pixel, e.g. (483/640)*423=319.2328125, but the image is filled to a width of 320. 319.2 is probably padded with white to 320 by your iOS code; whereas it is interpolated or extended with image pixel data from the last column(s) in your Android code. Perhaps there is a virtual-pixel setting that would mitigate that in iOS. – fmw42 May 18 '18 at 23:41
  • I can understand what you say about the border but there seems to be a more global change. Do you think it is a shift in the position of the pixels due to the padding? – bugman May 19 '18 at 08:34
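The byte comparison suggested in the comments can be sketched with a small helper. This is a minimal, hypothetical example (the class and method names `ByteDiff` and `firstMismatch` are made up for illustration); feed it the bytes of the two uploaded files, e.g. via `Files.readAllBytes`:

```java
public class ByteDiff {

    // Return the index of the first byte at which the two arrays differ,
    // or -1 if they are identical in both content and length.
    static int firstMismatch(byte[] a, byte[] b) {
        int limit = Math.min(a.length, b.length);
        for (int i = 0; i < limit; i++) {
            if (a[i] != b[i]) {
                return i;
            }
        }
        // Same prefix: arrays differ only if one is longer than the other.
        return a.length == b.length ? -1 : limit;
    }
}
```

A mismatch within the first few hundred bytes usually points at different encoder headers and metadata, while a mismatch deep in the file points at different compressed scan data.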

1 Answer


Under Android you are not uploading your JPEG file.

Instead you decode your JPEG file to a bitmap and compress that bitmap to a brand-new set of JPEG bytes, which is what you finally upload.

No wonder you end up with different bytes: JPEG is lossy, so even compressing the decoded bitmap at quality 100 produces a different file.

Do away with the bitmap. Just upload the bytes of the jpg file directly.
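To see why the decode/re-encode round trip changes the bytes, here is a minimal, platform-neutral sketch using `javax.imageio` (the class and method names `JpegRoundTrip`, `encode` and `decodeAndReencode` are hypothetical; Android's `BitmapFactory.decodeStream` plus `Bitmap.compress` behave analogously):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import javax.imageio.ImageIO;

public class JpegRoundTrip {

    // Compress a bitmap to JPEG bytes (lossy, like Bitmap.compress on Android).
    static byte[] encode(BufferedImage img) {
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            ImageIO.write(img, "jpg", out);
            return out.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // What the question's Android code effectively does: decode the JPEG
    // back to pixels, then compress those pixels to a brand-new JPEG stream.
    static byte[] decodeAndReencode(byte[] jpegBytes) {
        try {
            BufferedImage decoded = ImageIO.read(new ByteArrayInputStream(jpegBytes));
            return encode(decoded);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Because JPEG decoding does not reproduce the original pixels exactly, re-encoding those slightly different pixels yields a different byte stream, even though the image still looks the same and has the same dimensions. Uploading the original file's bytes untouched avoids the round trip entirely.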

 params.put("file", new ByteArrayInputStream(bitmapdata), "androidMobile.jpg");

Change to

 params.put("file", in, "androidMobile.jpg");

Remove the bitmap code entirely, since that input stream can only be read once.

greenapps
  • Yes, it seems like a likely answer and I was planning to try that next. If I upload an image from the camera, does that mean I have to save it to the file system first, or fish around for its supposed location? – bugman May 19 '18 at 08:32
  • If you use the camera intent in the right way the camera app will save the image to the file you indicated and then you already know which file it is. – greenapps May 19 '18 at 08:35