I have an app with an iOS version and an Android version. The app sends an image (a JPEG in this case) from the gallery or camera to a server using a multipart upload.
The images start life identical in dimensions, resolution and file size, and when I run an ImageMagick compare they are identical.
When they reach the server they look identical to the naked eye and have identical pixel dimensions, both 320 x 483 at 72 ppi. However, they have different file sizes, even though both platforms encode the JPEG at maximum quality. When I run an ImageMagick compare they are obviously different.
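For what it's worth, a platform-neutral way to check whether a plain decode/re-encode cycle changes the bytes by itself is to do the same round trip in desktop Java. This is only a sketch using javax.imageio (not the Android APIs), and original.jpg is a placeholder path:

import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class ReencodeCheck {
    public static void main(String[] args) throws IOException {
        File original = new File("original.jpg"); // placeholder path
        byte[] originalBytes = Files.readAllBytes(original.toPath());

        // Decode and immediately re-encode as JPEG (default encoder settings)
        BufferedImage img = ImageIO.read(original);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(img, "jpg", baos);
        byte[] reencodedBytes = baos.toByteArray();

        // If the decode/re-encode round trip were lossless and deterministic,
        // these two sizes would match
        System.out.println("original:   " + originalBytes.length + " bytes");
        System.out.println("re-encoded: " + reencodedBytes.length + " bytes");
    }
}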
These are the original images:
These are the images when uploaded:
iOS Code (using AFNetworking)
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
NSData *imageData = UIImageJPEGRepresentation(img, 1.0); // 1.0 = maximum JPEG quality
UIGraphicsEndImageContext();
...
[requestSerializer setValue:@"multipart/form-data" forHTTPHeaderField:@"content-type"];
NSMutableURLRequest *request =
    [requestSerializer multipartFormRequestWithMethod:@"POST"
                                            URLString:ServerPath
                                           parameters:sendDictionary
                            constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    // Attach the JPEG data as the "file" part of the multipart body
    [formData appendPartWithFileData:imageData name:@"file" fileName:@"temp.jpeg" mimeType:@"image/jpeg"];
} error:nil];
Android Code (using loopj for the async request)
RequestParams params = new RequestParams();

// Decode the saved image into a Bitmap
BitmapFactory.Options BMoptions = new BitmapFactory.Options();
BMoptions.inSampleSize = 1; // 1 = decode at full size (no downsampling)
InputStream in;
try {
    in = getContentResolver().openInputStream(Uri.fromFile(new File(Globals.getImagePath())));
    Bitmap bitmap = BitmapFactory.decodeStream(in, null, BMoptions);
    bitmap = scaleToActualAspectRatio2(bitmap);

    // Re-encode the Bitmap as a JPEG at maximum quality and upload the bytes
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    bitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
    byte[] bitmapdata = baos.toByteArray();
    params.put("file", new ByteArrayInputStream(bitmapdata), "androidMobile.jpg");
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
I presume that it is the pixel intensities that make the difference.
Can someone explain why the images arrive different? Do I have to save the Android image to a file first and upload it as a file?
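For reference, this is the alternative I have in mind: handing loopj the file itself so the original JPEG bytes go up untouched, with no decode/re-encode step. A minimal sketch, assuming the two-argument RequestParams.put(String, File) overload is available in the loopj version in use, and reusing Globals.getImagePath() from the code above:

RequestParams params = new RequestParams();
try {
    // Upload the source file directly; the bytes on the wire are the bytes on disk,
    // so BitmapFactory.decodeStream() and Bitmap.compress() never touch the image.
    params.put("file", new File(Globals.getImagePath()));
} catch (FileNotFoundException e) {
    e.printStackTrace();
}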