Apple provides some elegant sample code for managing pinch gestures in a UIKit environment, which can be downloaded directly from Apple. The sample shows three coloured rectangles that can each be panned, pinched and rotated. I will focus mainly on an issue with the pinch gesture.
My problem arises when trying to make this code work in a mixed environment: UIKit gestures created in a UIViewRepresentable's Coordinator update a model class, which in turn publishes values that trigger redraws in SwiftUI. Passing the data across doesn't seem to be an issue, but the behaviour on the SwiftUI side is not what I expect.
Specifically, the pinch gesture shows an unexpected jump when it starts, and the bigger the current scale, the more noticeable the jump. The anchor position and the previous anchor position also seem to affect this behaviour, but I'm not sure how exactly.
Here is Apple's code for a UIKit environment:
func pinchPiece(_ pinchGestureRecognizer: UIPinchGestureRecognizer) {
    guard pinchGestureRecognizer.state == .began || pinchGestureRecognizer.state == .changed,
          let piece = pinchGestureRecognizer.view else {
        return
    }
    adjustAnchor(for: pinchGestureRecognizer)

    let scale = pinchGestureRecognizer.scale
    piece.transform = piece.transform.scaledBy(x: scale, y: scale)
    pinchGestureRecognizer.scale = 1 // Clear scale so that it is the right delta next time.
}

private func adjustAnchor(for gestureRecognizer: UIGestureRecognizer) {
    guard let piece = gestureRecognizer.view, gestureRecognizer.state == .began else {
        return
    }
    let locationInPiece = gestureRecognizer.location(in: piece)
    let locationInSuperview = gestureRecognizer.location(in: piece.superview)
    let anchorX = locationInPiece.x / piece.bounds.size.width
    let anchorY = locationInPiece.y / piece.bounds.size.height
    piece.layer.anchorPoint = CGPoint(x: anchorX, y: anchorY)
    piece.center = locationInSuperview
}
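Note the trick in adjustAnchor(for:): changing layer.anchorPoint on its own would visibly shift the layer, so Apple compensates by moving piece.center to the gesture location in the superview, which is exactly the point the new anchor refers to. A standalone version of that compensation (my own sketch, not part of Apple's sample; setAnchor is a name I made up) would be:

import UIKit

/// Moves a view's layer anchor point without letting the view appear to move.
/// Equivalent in effect to Apple's adjustAnchor(for:), which relies on the
/// gesture location being both the new anchor and the new center.
func setAnchor(_ anchor: CGPoint, on view: UIView) {
    // Anchor points are in unit coordinates; convert old and new anchors to
    // points and run both through the current transform so that scale and
    // rotation are taken into account.
    let newPoint = CGPoint(x: view.bounds.width * anchor.x,
                           y: view.bounds.height * anchor.y)
        .applying(view.transform)
    let oldPoint = CGPoint(x: view.bounds.width * view.layer.anchorPoint.x,
                           y: view.bounds.height * view.layer.anchorPoint.y)
        .applying(view.transform)
    // Shift position by exactly the amount the anchor change would have
    // moved the layer, so the two cancel out on screen.
    view.layer.position = CGPoint(x: view.layer.position.x + newPoint.x - oldPoint.x,
                                  y: view.layer.position.y + newPoint.y - oldPoint.y)
    view.layer.anchorPoint = anchor
}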
A piece in Apple's code is one of the rectangles we see in the sample code. In my code a piece is a UIKit object living in a UIViewRepresentable; I call it uiView and it holds all the gestures that it responds to:
@objc func pinch(_ gesture: UIPinchGestureRecognizer) {
    guard gesture.state == .began || gesture.state == .changed,
          let uiView = gesture.view else {
        return
    }
    adjustAnchor(for: gesture)

    parent.model.scale *= gesture.scale
    gesture.scale = 1
}

private func adjustAnchor(for gesture: UIPinchGestureRecognizer) {
    guard let uiView = gesture.view, gesture.state == .began else {
        return
    }
    let locationInUIView = gesture.location(in: uiView)
    let locationInSuperview = gesture.location(in: uiView.superview)
    let anchorX = locationInUIView.x / uiView.bounds.size.width
    let anchorY = locationInUIView.y / uiView.bounds.size.height
    parent.model.anchor = UnitPoint(x: anchorX, y: anchorY)
    // parent.model.offset = CGSize(width: locationInSuperview.x, height: locationInSuperview.y)
}
The parent.model refers to the model class that comes through an EnvironmentObject directly into the UIViewRepresentable struct.
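The model itself is minimal; it is roughly this (a sketch, with just the properties the code above touches):

import SwiftUI

final class Model: ObservableObject {
    // Published so any change triggers a SwiftUI redraw.
    @Published var scale: CGFloat = 1
    @Published var anchor: UnitPoint = .center
    @Published var offset: CGSize = .zero
}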
On the SwiftUI side of things, ContentView looks like this (for clarity I'm using just one CustomUIView instead of Apple's three pieces):
struct ContentView: View {
    @EnvironmentObject var model: Model

    var body: some View {
        CustomUIView()
            .frame(width: 300, height: 300)
            .scaleEffect(model.scale, anchor: model.anchor)
            .offset(model.offset)
    }
}
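And CustomUIView is a minimal UIViewRepresentable whose Coordinator owns the gesture handlers shown above (sketched here; only the gesture wiring matters):

import SwiftUI
import UIKit

struct CustomUIView: UIViewRepresentable {
    @EnvironmentObject var model: Model

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    func makeUIView(context: Context) -> UIView {
        let uiView = UIView()
        uiView.backgroundColor = .systemBlue
        // The Coordinator is the target of the UIKit gesture recognizer.
        uiView.addGestureRecognizer(UIPinchGestureRecognizer(
            target: context.coordinator,
            action: #selector(Coordinator.pinch(_:))))
        return uiView
    }

    func updateUIView(_ uiView: UIView, context: Context) {}

    class Coordinator: NSObject {
        var parent: CustomUIView

        init(_ parent: CustomUIView) { self.parent = parent }

        @objc func pinch(_ gesture: UIPinchGestureRecognizer) {
            // Body as shown above: adjustAnchor(for:), then update the model.
        }
    }
}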
As soon as you try to pinch on the CustomUIView, the rectangle jumps slightly, as if it were not applying an initial translation to compensate for the new anchor. The scaling itself does follow the anchor, and the offset is applied correctly when panning.
One odd hint: the initial jump goes in the direction of the anchor but stops about half way there, never reaching the right translation and making the CustomUIView jump under your fingers. The closer you keep pinching to the previous anchor, the less noticeable the jump.
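For what it's worth, here is my back-of-the-envelope check of the jump (my own reasoning, nothing from Apple's sample):

import SwiftUI

// .scaleEffect(s, anchor: a) maps a point p in the view's own space to
//     A + s * (p - A), where A = (a.x * width, a.y * height).
// Switching the anchor from a0 to a1 at a fixed scale s therefore shifts
// the rendered view by (A0 - A1) * (s - 1): bigger scale, bigger jump,
// and no jump at all if the new anchor equals the previous one.
func expectedJump(from a0: UnitPoint, to a1: UnitPoint,
                  size: CGSize, scale: CGFloat) -> CGSize {
    CGSize(width: (a0.x - a1.x) * size.width * (scale - 1),
           height: (a0.y - a1.y) * size.height * (scale - 1))
}

That matches the scale- and anchor-dependence I'm seeing, but the observed jump only covers about half of the predicted distance, which is the part I can't explain.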
Any help on this one would be greatly appreciated!