
I'm working on an image processing framework and I use this code to read RGB data:

if let data = image.cgImage?.dataProvider?.data {
  let dataPtr: UnsafePointer<UInt8> = CFDataGetBytePtr(data)
  // Assumes 4 bytes per pixel in R,G,B,A order and no row padding
  let width = Int(image.size.width)
  let height = Int(image.size.height)
  for y in 0..<height {
    for x in 0..<width {
      let pixelInfo: Int = ((width * y) + x) * 4
      let r = dataPtr[pixelInfo]
      let g = dataPtr[pixelInfo + 1]
      let b = dataPtr[pixelInfo + 2]
      print("\(r), \(g), \(b)")
    }
  }
}
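
For reference, here is a variant of the same dump that reads the CGImage's own pixel width, height and bytesPerRow instead of UIImage.size (which is in points and can differ from the pixel size on @2x/@3x images). The function name is just for illustration, and it still assumes an 8-bit buffer with red, green and blue in that byte order:

import UIKit

// Diagnostic sketch only: same dump, but driven by the CGImage's own geometry
func dumpPixels(of image: UIImage) {
  guard let cgImage = image.cgImage,
        let data = cgImage.dataProvider?.data,
        let dataPtr = CFDataGetBytePtr(data) else { return }

  let width = cgImage.width              // pixels, not points
  let height = cgImage.height
  let bytesPerRow = cgImage.bytesPerRow  // rows may be padded
  let bytesPerPixel = cgImage.bitsPerPixel / 8

  for y in 0..<height {
    for x in 0..<width {
      let pixelInfo = y * bytesPerRow + x * bytesPerPixel
      let r = dataPtr[pixelInfo]
      let g = dataPtr[pixelInfo + 1]
      let b = dataPtr[pixelInfo + 2]
      print("\(r), \(g), \(b)")
    }
  }
}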

Then, if I create a new Swift project and a new Objective-C project and use the same code (using a bridging header file for the Objective-C project), I get different results, for example:

5, 36, 20;   24, 69, 48 (Swift)
5, 36, 18;   21, 69, 47 (Objc)

This causes significantly different results in further processing. I've tried using Objective-C code that reads the data with CGBitmapContextCreate(), but I get exactly the same result. Both apps show the same ColorSpace; I've tried setting it manually to DeviceRGB and sRGB, without any luck.
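
To double-check that both targets really decode the same bitmap, I compare the CGImage properties that determine how the raw bytes should be interpreted. This is only a small diagnostic sketch (the function name is just for illustration):

import UIKit

// Print the properties that control how the raw pixel bytes are laid out
func dumpImageInfo(_ image: UIImage) {
  guard let cgImage = image.cgImage else { return }
  print("size (points):", image.size, "scale:", image.scale)
  print("pixels:", cgImage.width, "x", cgImage.height)
  print("bitsPerComponent:", cgImage.bitsPerComponent,
        "bitsPerPixel:", cgImage.bitsPerPixel,
        "bytesPerRow:", cgImage.bytesPerRow)
  print("alphaInfo:", cgImage.alphaInfo.rawValue,
        "bitmapInfo:", cgImage.bitmapInfo.rawValue)
  print("colorSpace:", String(describing: cgImage.colorSpace))
}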

I have to match the Objective-C output with an Android app that produces exactly the same results as the Swift app.

UPDATE. A second solution that I've tried is to write different code for Objective-C, and it returns exactly the same result, which doesn't match Swift:

size_t bytesSize = 0;
unsigned char *bytes = [self getBytesFromImage:image dataSize:&bytesSize];

// Destination buffer for further processing (filling it is omitted here)
size_t doubleSize = sizeof(double) * bytesSize;
double *doubles = (double *)malloc(doubleSize);

size_t doublesIndex = 0;
size_t counter = 0;
while (counter < bytesSize) {
    unsigned char r = bytes[counter];
    unsigned char g = bytes[counter + 1];
    unsigned char b = bytes[counter + 2];
    NSLog(@"%d, %d, %d", r, g, b);
    counter += 4;
}

- (unsigned char *)getBytesFromImage:(UIImage *)image dataSize:(size_t *)dataSize {
    *dataSize = (size_t)(4 * image.size.width * image.size.height);
    unsigned char *imageData = (unsigned char *)malloc(*dataSize);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef imageRef = [image CGImage];

    // Draw the image into a 32-bit bitmap backed by imageData;
    // the caller is responsible for free()ing the returned buffer.
    CGContextRef bitmap = CGBitmapContextCreate(imageData,
                                                image.size.width,
                                                image.size.height,
                                                8,
                                                image.size.width * 4,
                                                colorSpace,
                                                kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(bitmap, CGRectMake(0, 0, image.size.width, image.size.height), imageRef);

    CGContextRelease(bitmap);
    CGColorSpaceRelease(colorSpace);

    return imageData;
}
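
As an aside on this variant: kCGBitmapByteOrder32Little combined with kCGImageAlphaPremultipliedFirst lays the bytes out as B, G, R, A in memory, so offsets 0/1/2 are actually blue, green, red. If offsets 0/1/2 should really be R, G, B, a context with an explicit RGBA layout is less surprising. The following is only a rough Swift sketch of that setup (the function name is invented, and premultiplied alpha will still change the values of any pixel whose alpha is not 255):

import UIKit

// Sketch: draw into a context whose memory layout is known to be R,G,B,A
func rgbaBytes(from image: UIImage) -> [UInt8]? {
  guard let cgImage = image.cgImage else { return nil }

  let width = cgImage.width
  let height = cgImage.height
  let bytesPerRow = width * 4
  var buffer = [UInt8](repeating: 0, count: bytesPerRow * height)

  let colorSpace = CGColorSpaceCreateDeviceRGB()
  // Big-endian 32-bit pixels with alpha last => bytes are R, G, B, A
  let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue |
                   CGBitmapInfo.byteOrder32Big.rawValue

  let drawn: Bool = buffer.withUnsafeMutableBytes { raw in
    guard let context = CGContext(data: raw.baseAddress,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: bytesPerRow,
                                  space: colorSpace,
                                  bitmapInfo: bitmapInfo) else { return false }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return true
  }
  return drawn ? buffer : nil
}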
Andrey Soloviev
  • What does "run this code in Objective-C and Swift apps" mean? – matt Oct 26 '20 at 05:42
  • Does the original data have a colour space? (not the output colour space). Note: the differences are minimal; it could just be two different numerical methods for converting the original data to RGB [working with integers, we often have shortcut methods] – Giacomo Catenazzi Oct 26 '20 at 08:33
  • You show (a tiny portion of) the Swift side - also show the Objective-C side. – skaak Oct 26 '20 at 14:31
  • @skaak I've added more code that prints out R, G and B. – Andrey Soloviev Oct 28 '20 at 04:07
  • @GiacomoCatenazzi I've tried images without a color space and with "sRGB IEC61966-2.1". – Andrey Soloviev Oct 28 '20 at 04:09
  • @matt That means the same code is used in an application based on Objective-C (with a bridging header) and in a pure Swift app. – Andrey Soloviev Oct 28 '20 at 04:10
  • I think I see the problem - you are reading the data directly from the image on the Swift side, but on the Objective-C side you create a new context and colourspace, draw the image in there and then return that. Seems you are looking for trouble... why not do the same on both sides? Get the image data via CGImage and its data provider on the Objective-C side as well, as you do on the Swift side, and you will have a happy life. – skaak Oct 28 '20 at 05:36
  • @skaak The Objective-C code that I've posted is another solution that I've tried. As I mentioned earlier, I use the Swift code with a bridging header file and still get different results; then I tried CGBitmapContextCreate in Objective-C code, without any luck. So again: if I create a new Swift project and use my Swift code, it gives me one set of data. Then I create another new project (Objective-C), use the same Swift code with a bridging header, and it gives me different results… I'm lost. – Andrey Soloviev Oct 28 '20 at 11:14

1 Answer


Based on your Swift code, I coded what you see below.

I used your code and this code and they seem to match perfectly - at least for an image I am using on this side.

// Dump some bytes
- ( void ) test
{
    UIImage * img = [UIImage imageNamed:@"test"];
    // CGImageGetDataProvider follows the "Get" rule, so we do not own
    // (and must not release) the returned provider
    CGDataProviderRef dp = CGImageGetDataProvider( img.CGImage );
    // CGDataProviderCopyData follows the "Copy" rule, so this one we do release
    CFDataRef data = CGDataProviderCopyData ( dp );
    CFIndex length = CFDataGetLength ( data );
    NSUInteger i = 0;
    unsigned char rgba [ 4 ];

    // Print all
    // while ( i < length )
    // Print first few
    while ( i < 100 )
    {
        CFDataGetBytes( data, CFRangeMake( i, sizeof( rgba ) ), rgba );
        i += 4;

        NSLog ( @"RGB %d %d %d", rgba[ 0 ], rgba[ 1 ], rgba[ 2 ] );
    }
    CFRelease ( data );
}
skaak
  • Thanks, it looks better than using CGBitmapContextCreate. Besides that, I've found that the problem is somewhere else. I created another Swift project, copied my code there and got the same results as Objective-C, but another Swift project with exactly the same code returns different values. I will go through all the build settings; maybe something is there. – Andrey Soloviev Oct 29 '20 at 07:08
  • It seems I ran the other Swift app in the Simulator, and that is what gives me the different values. – Andrey Soloviev Oct 29 '20 at 07:13
  • That is how it goes - you hunt for something here and all the time it is hiding over there... My code looks clean but is not tested that much; the other code, I think, is what you use if you want to change the colorspace of the bitmap. – skaak Oct 29 '20 at 07:21
  • My code has a release statement for the copied data, which I added as I think it should be there, but if it ever gives you trouble then that is where to scratch for a solution. But let us hope for the best. – skaak Oct 29 '20 at 07:23