I'm trying to build a simple iPad app that continuously tracks the pointer's coordinates as it moves across the screen (via an external mouse or trackpad). Basically, something like this JavaScript example @ 4:13, except in a SwiftUI view on iPadOS.
macOS has NSEvent.mouseLocation, but there doesn't seem to be an iPadOS counterpart. Every resource I've found online ties coordinates to a gesture (rotation, pinch, drag, click, etc.), with no way to respond to cursor movement alone. This leads me to believe that the solution for pure pointer movement probably lies outside the Gesture protocol.
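For comparison, here's roughly the macOS behavior I'm after (a quick sketch from memory, assuming an AppKit context, not the code I actually need):

import AppKit

// Global pointer position in screen coordinates (bottom-left origin).
let location = NSEvent.mouseLocation
print("x: \(location.x), y: \(location.y)")

// Or react to movement as it happens:
NSEvent.addLocalMonitorForEvents(matching: .mouseMoved) { event in
    print("moved to \(event.locationInWindow)")
    return event // pass the event along unmodified
}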
I got halfway there by modifying code from this SO post. The code below displays updated pointer coordinates, but only while the pointer is dragging (i.e., while at least one button is pressed).
import SwiftUI

struct ContentView: View {
    @State private var pt = CGPoint()
    @State private var txt = "init"

    var body: some View {
        // minimumDistance: 0 makes the gesture report from the moment of touch-down
        let myGesture = DragGesture(
            minimumDistance: 0,
            coordinateSpace: .local
        )
        .onChanged {
            self.pt = $0.location
            self.txt = "x: \(self.pt.x), y: \(self.pt.y)"
        }

        // Spacers needed to make the VStack occupy the whole screen
        return VStack {
            Spacer()
            Text(self.txt)
            Spacer()
        }
        .frame(width: 1000, height: 1000)
        .border(Color.green)
        .contentShape(Rectangle()) // Make the entire VStack tappable; otherwise only the area with text generates a gesture
        .gesture(myGesture) // Attach the gesture to the VStack
    }
}
Now, how do I achieve this effect without needing to drag? Could there be some way to do it with the help of Objective-C if a pure Swift solution isn't possible?
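One direction I started sketching (untested on hardware; `HoverTrackingView` and the `onMove` closure are names I made up) is wrapping UIKit's UIHoverGestureRecognizer, which fires as the pointer moves with no buttons pressed, in a UIViewRepresentable:

import SwiftUI
import UIKit

// Bridges UIHoverGestureRecognizer (iOS 13+) into SwiftUI so the view
// can react to pure pointer movement, no press required.
struct HoverTrackingView: UIViewRepresentable {
    var onMove: (CGPoint) -> Void

    func makeUIView(context: Context) -> UIView {
        let view = UIView()
        view.addGestureRecognizer(
            UIHoverGestureRecognizer(
                target: context.coordinator,
                action: #selector(Coordinator.hovered(_:))
            )
        )
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(onMove: onMove) }

    class Coordinator: NSObject {
        let onMove: (CGPoint) -> Void
        init(onMove: @escaping (CGPoint) -> Void) { self.onMove = onMove }

        @objc func hovered(_ recognizer: UIHoverGestureRecognizer) {
            // Reports coordinates continuously while the pointer hovers.
            onMove(recognizer.location(in: recognizer.view))
        }
    }
}

// Usage, e.g. as an overlay on the VStack above:
//     .overlay(HoverTrackingView { pt = $0 })

I also looked at SwiftUI's onHover modifier, but as far as I can tell it only reports an entered/exited Bool rather than coordinates, so I'm not sure it helps here. Is wrapping a UIKit recognizer like this the intended route, or is there something more direct?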
Thanks
Edit: