
On the iPad, I'm trying to build a simple app that continuously tracks the pointer's coordinates as it moves across the screen (via an external mouse or trackpad). Basically something like this JavaScript example @ 4:13, except in a SwiftUI view on the iPad.

macOS has NSEvent.mouseLocation, but there doesn't seem to be an iPadOS counterpart. Every resource I've come across online necessarily associates coordinates with a gesture (rotation, pinch, drag, click, etc.), with no way to respond to cursor movement alone. This leads me to believe that the solution for pure pointer movement is likely independent of the Gesture protocol.
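For context, this is the macOS behavior being referenced, a minimal AppKit sketch: NSEvent.mouseLocation is a static property that can be polled at any time, with no gesture required.

```swift
import AppKit

// macOS only: read the pointer's current position without any gesture.
// NSEvent.mouseLocation is an NSPoint (CGPoint) in screen coordinates.
let location = NSEvent.mouseLocation
print("x: \(location.x), y: \(location.y)")
```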

I was able to get halfway there by modifying code from this SO post. The code below displays updated mouse coordinates so long as the pointer is dragging (i.e., while at least one button is pressed).

import SwiftUI

struct ContentView: View {
    @State private var pt = CGPoint()
    @State private var txt = "init"

    var body: some View {
        let myGesture = DragGesture(
            minimumDistance: 0,
            coordinateSpace: .local
        )
        .onChanged {
            self.pt = $0.location
            self.txt = "x: \(self.pt.x), y: \(self.pt.y)"
        }


        // Spacers needed to make the VStack occupy the whole screen
        return VStack {
            Spacer()
            Text(self.txt)
            Spacer()
        }
        .frame(width: 1000, height: 1000)
        .border(Color.green)
        .contentShape(Rectangle()) // Make the entire VStack hit-testable; otherwise, only the area with text generates a gesture
        .gesture(myGesture) // Attach the gesture to the VStack
    }
}

Now, how do I achieve this effect without needing to drag? Could there be some way to do this with the help of Objective-C if a pure Swift solution isn't possible?

Thanks

1 Answer

Score: -1

The WWDC video covers this very topic:

Handle trackpad and mouse input

Add SupportsIndirectInputEvents to your Info.plist

From the video:

It is required in order to get the new touch type indirect pointer and EventType.transform. Existing projects do not have this key set and will need to add it. Starting with iOS 14 and macOS Big Sur SDKs, new UIKit and SwiftUI projects will have this value set to "true."
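For reference, a minimal Info.plist fragment — assuming the full key name is UIApplicationSupportsIndirectInputEvents, as the bundle-resources documentation spells it:

```xml
<!-- Opts the app into indirect input events (trackpad/mouse pointer) -->
<key>UIApplicationSupportsIndirectInputEvents</key>
<true/>
```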

In addition, you will use UIPointerInteraction. This tutorial shows you step by step, including custom cursors:

https://pspdfkit.com/blog/2020/supporting-pointer-interactions/
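As a sketch of how UIPointerInteraction could surface coordinates in a SwiftUI view (the wrapper type and callback names here are my own, not from the tutorial): the delegate's region request fires repeatedly as the pointer moves over the view, and request.location carries the position in the view's coordinate space.

```swift
import SwiftUI
import UIKit

// Hypothetical wrapper: bridges UIPointerInteraction into SwiftUI so the
// delegate's region callback can report the pointer's location as it moves.
struct PointerTrackingView: UIViewRepresentable {
    var onMove: (CGPoint) -> Void

    func makeUIView(context: Context) -> UIView {
        let view = UIView()
        // The regionFor delegate callback fires as the pointer moves over the view.
        view.addInteraction(UIPointerInteraction(delegate: context.coordinator))
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(onMove: onMove) }

    class Coordinator: NSObject, UIPointerInteractionDelegate {
        let onMove: (CGPoint) -> Void
        init(onMove: @escaping (CGPoint) -> Void) { self.onMove = onMove }

        func pointerInteraction(_ interaction: UIPointerInteraction,
                                regionFor request: UIPointerRegionRequest,
                                defaultRegion: UIPointerRegion) -> UIPointerRegion? {
            // request.location is the pointer position in the view's coordinates.
            onMove(request.location)
            return defaultRegion
        }
    }
}
```

PointerTrackingView(onMove:) could then be layered behind or over your content (e.g., in a ZStack) and the closure used to update an @State coordinate, analogous to the DragGesture version in the question.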

Jeshua Lacock
  • This is iOS 14 so it has already been added my project's Info.plist by default. However, taking a look at the features available when set to true (https://developer.apple.com/documentation/bundleresources/information_property_list/uiapplicationsupportsindirectinputevents), I don't think this solves the original question of coordinate tracking, right? – A is for Ambition Jul 08 '21 at 23:47
  • Why the downvotes? I'm pointing the OP directly to information that answers their question. – Jeshua Lacock Jul 09 '21 at 05:10