
So I have a project where I take an image, display it, and depending on where you tap, it gives back the RGB values.

However, the display on the iPhone is much smaller than the image resolution, so the image gets scaled down.

I tried to circumvent this by multiplying the coordinates of the tap location on the UIImageView by image.x/imageView.x and image.y/imageView.y respectively.

But the colors are still way off.

My code:

@IBAction func imageTap(_ sender: UITapGestureRecognizer) {
    if sender.state == .ended {
        let location = sender.location(in: imageDisplay)
        let widthFactor = image.size.width / imageDisplay.frame.width
        let heightFactor = image.size.height / imageDisplay.frame.height

        let scaledWidth = location.x * widthFactor
        let scaledHeight = location.y * heightFactor
        let scaledLocation = CGPoint(x: scaledWidth, y: scaledHeight)

        let colorAtLocation = image.getPixelColor(pos: scaledLocation)
        let rgbValues = colorAtLocation.rgb()

        let rValue = rgbValues!.red
        let gValue = rgbValues!.green
        let bValue = rgbValues!.blue

        redValue.text = "\(String(describing: rValue))"
        greenValue.text = "\(String(describing: gValue))"
        blueValue.text = "\(String(describing: bValue))"

        colorViewer.backgroundColor = colorAtLocation
    }
}
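For context, the `getPixelColor(pos:)` helper I call above looks roughly like this (a sketch adapted from common snippets; the exact implementation and the 8-bit RGBA byte layout are my assumptions, not guaranteed for every image):

```swift
import UIKit

extension UIImage {
    // Reads one pixel straight from the CGImage's backing data.
    // Assumes an 8-bit-per-channel RGBA bitmap; other pixel formats
    // (e.g. BGRA, grayscale) would need different byte offsets.
    func getPixelColor(pos: CGPoint) -> UIColor {
        guard let cgImage = self.cgImage,
              let data = cgImage.dataProvider?.data,
              let bytes = CFDataGetBytePtr(data) else {
            return .clear
        }
        let bytesPerPixel = cgImage.bitsPerPixel / 8
        let offset = Int(pos.y) * cgImage.bytesPerRow + Int(pos.x) * bytesPerPixel
        let r = CGFloat(bytes[offset])     / 255.0
        let g = CGFloat(bytes[offset + 1]) / 255.0
        let b = CGFloat(bytes[offset + 2]) / 255.0
        let a = CGFloat(bytes[offset + 3]) / 255.0
        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}
```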

How should I calculate the coordinates correctly?

Possible places where this could go wrong:

  1. The 0;0 origin isn't where I think it is
  2. The UIImageView's content mode shouldn't be Aspect Fit
  3. The image scaling isn't as linear as I thought

This is all I could think of, but how would I go about checking these?
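To make hypothesis 2 concrete: if the content mode really is Aspect Fit, the image is scaled by a single uniform factor and centered, leaving letterbox margins, so my separate width/height factors would be wrong whenever the aspect ratios differ. This is a sketch of the mapping I believe Aspect Fit implies (the centering/offset math is my assumption):

```swift
import Foundation

// Map a tap point in the image view's coordinate space to pixel
// coordinates in the image, assuming .scaleAspectFit. Returns nil
// when the tap lands in the letterbox margins outside the image.
func imagePoint(forTap tap: CGPoint, viewSize: CGSize, imageSize: CGSize) -> CGPoint? {
    // Aspect Fit uses one scale factor: the smaller of the two ratios.
    let scale = min(viewSize.width / imageSize.width,
                    viewSize.height / imageSize.height)
    let fittedSize = CGSize(width: imageSize.width * scale,
                            height: imageSize.height * scale)
    // The fitted image is centered, so subtract the margins first.
    let xOffset = (viewSize.width - fittedSize.width) / 2
    let yOffset = (viewSize.height - fittedSize.height) / 2
    let x = (tap.x - xOffset) / scale
    let y = (tap.y - yOffset) / scale
    guard x >= 0, x < imageSize.width, y >= 0, y < imageSize.height else {
        return nil
    }
    return CGPoint(x: x, y: y)
}
```

For example, a 400x400 image in a 200x100 view fits at scale 0.25 with 50-point margins on the left and right, so a tap at (100, 50) maps to pixel (200, 200), while a tap at (10, 50) is in the margin and returns nil.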
user6879072
  • What color do you *want* to get? If you have a 1500x1500 image and you're showing it in a 150x150 view (1/10th), the pixels you see on the screen are obviously not the same as the pixels in the image. So... if you tap at x:20 y:20, do you want the color from 20,20 in the *view* or 200,200 in the image? – DonMag Jun 29 '17 at 18:12
  • I want the colour from 20;20 on the UIImageView, but I thought that was equal to 200;200 on the image. My code in this case would return the color at 200;200 on the image. – user6879072 Jun 29 '17 at 18:31
  • OK - imagine you have a 400x400 image, displayed in a 200x200 image view - exactly 1/2 the size. If the image consists of single-pixel vertical lines, alternating between red and white, what do you get in the 200x200 view? Do you get solid red - so, every-other pixel? Or do you get a "blend" of the red and white adjacent pixels? And, which do you *really* want as a result of the tap? – DonMag Jun 29 '17 at 18:50
  • Is this a trick question? I don't know how the picture would appear. I guess it would be the blend, and I want the blend of colours, because that is what we perceive with our eyes. – user6879072 Jun 29 '17 at 19:03
  • 1
    OK - you want to use one of the answers (the one from [Mark Moeykens](https://stackoverflow.com/users/2209965/mark-moeykens) works well) in this post: https://stackoverflow.com/questions/12770181/how-to-get-the-pixel-color-on-touch – DonMag Jun 29 '17 at 19:16
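For future readers: the approach DonMag links to sidesteps the coordinate mapping entirely by rendering the view and reading the on-screen (blended) pixel. A rough sketch of that idea (the function name and structure here are mine; see the linked answer for the original):

```swift
import UIKit

// Render the view into a 1x1 bitmap positioned so the tapped point
// lands at the origin, then read that single RGBA pixel. This returns
// the color as displayed, i.e. the downsampled blend the user sees.
func renderedColor(of view: UIView, at point: CGPoint) -> UIColor? {
    var pixel = [UInt8](repeating: 0, count: 4)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    guard let context = CGContext(data: &pixel, width: 1, height: 1,
                                  bitsPerComponent: 8, bytesPerRow: 4,
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    // Shift the context so the tapped point draws at (0, 0).
    context.translateBy(x: -point.x, y: -point.y)
    view.layer.render(in: context)
    return UIColor(red: CGFloat(pixel[0]) / 255,
                   green: CGFloat(pixel[1]) / 255,
                   blue: CGFloat(pixel[2]) / 255,
                   alpha: CGFloat(pixel[3]) / 255)
}
```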
