I have a project where I take an image, display it, and return the RGB values of the pixel you tap on.
However, the display on the iPhone is much smaller than the image resolution, so the image gets scaled down.
I tried to work around this by multiplying the coordinates of the tap location on the UIImageView by image.x/imageview.x and image.y/imageview.y respectively.
But the colors are still way off.
My code:
@IBAction func imageTap(_ sender: UITapGestureRecognizer) {
    if sender.state == .ended {
        // Tap location in the image view's coordinate space
        let location = sender.location(in: imageDisplay)

        // Scale the tap coordinates up to the image's resolution
        let widthFactor = image.size.width / imageDisplay.frame.width
        let heightFactor = image.size.height / imageDisplay.frame.height
        let scaledWidth = location.x * widthFactor
        let scaledHeight = location.y * heightFactor
        let scaledLocation = CGPoint(x: scaledWidth, y: scaledHeight)

        // Read the pixel color and show its RGB components
        let colorAtLocation = image.getPixelColor(pos: scaledLocation)
        let rgbValues = colorAtLocation.rgb()
        let rValue = rgbValues!.red
        let gValue = rgbValues!.green
        let bValue = rgbValues!.blue
        redValue.text = "\(rValue)"
        greenValue.text = "\(gValue)"
        blueValue.text = "\(bValue)"
        colorViewer.backgroundColor = colorAtLocation
    }
}
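For reference, getPixelColor is the usual CGImage byte-reading extension; this is a sketch of what mine is equivalent to, assuming an RGBA image with 4 bytes per pixel:

import UIKit

extension UIImage {
    func getPixelColor(pos: CGPoint) -> UIColor {
        guard let cgImage = cgImage,
              let data = cgImage.dataProvider?.data,
              let ptr = CFDataGetBytePtr(data) else { return .clear }
        // Index of the tapped pixel in the raw RGBA byte buffer
        let pixelIndex = Int(pos.y) * cgImage.bytesPerRow + Int(pos.x) * 4
        return UIColor(red:   CGFloat(ptr[pixelIndex])     / 255.0,
                       green: CGFloat(ptr[pixelIndex + 1]) / 255.0,
                       blue:  CGFloat(ptr[pixelIndex + 2]) / 255.0,
                       alpha: CGFloat(ptr[pixelIndex + 3]) / 255.0)
    }
}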
How should I calculate the coordinates correctly?
Possible places where this could go wrong:
- The origin (0, 0) isn't where I think it is
- The UIImageView's content mode shouldn't be Aspect Fit (see the sketch after this list for how I'd map the tap under aspect fit)
- The image scaling isn't as linear as I thought

This is all I could think of, but how would I go about checking these?
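For the Aspect Fit hypothesis, here is a minimal sketch of the mapping I think should apply, assuming the content mode really is .scaleAspectFit. AVMakeRect comes from AVFoundation, and the helper name imagePoint(from:in:) is just mine for illustration:

import UIKit
import AVFoundation

// Maps a tap point in the image view's coordinate space to pixel
// coordinates in the image, assuming the view uses .scaleAspectFit.
// Returns nil when the tap lands in the letterbox bars outside the image.
func imagePoint(from viewPoint: CGPoint, in imageView: UIImageView) -> CGPoint? {
    guard let image = imageView.image else { return nil }
    // The rect the image actually occupies inside the view under aspect fit
    let displayRect = AVMakeRect(aspectRatio: image.size, insideRect: imageView.bounds)
    guard displayRect.contains(viewPoint) else { return nil }
    // Aspect fit preserves the aspect ratio, so one uniform scale factor applies
    let scale = image.size.width / displayRect.width
    return CGPoint(x: (viewPoint.x - displayRect.minX) * scale,
                   y: (viewPoint.y - displayRect.minY) * scale)
}

If that mapping is right, my code above would be wrong in two ways: it ignores the letterbox offset, and it uses two separate scale factors where there should be only one.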