
I'm working on an app that includes the screen shown below. The panel with the list view is instantiated from a Nib, but the pale grey panel with the drawing in it is a dynamically generated UIView, which is a subview of a UIView subclass called FrameView (for the purposes of the question).

The red dot in the corner is a delete button for that drawing. The drawing is the content of a Drawing object, which has a many-to-many relationship to the item selected in the list. When I select an item in the list, zero or more such panels, showing the drawings for that item, are added as subviews of FrameView.

In order for those delete buttons to be clickable, FrameView has user interaction enabled. This happens when I select an item in the list. It's off when FrameView first appears.

At the bottom left is the key navigation button. It has a variety of gestures and taps associated with it that allow the user to move between the different editors that use the main screen. This button has a relatively high zPosition in the main view.

But once FrameView has its user interaction turned on, it stops clicks and gestures from reaching the navigation button.

I would have thought that increasing the zPosition of the navigation button above FrameView would solve the problem, but it doesn't. How can I make the navigation button receive taps and gestures, even when FrameView has user interaction enabled? Or am I going about this the wrong way?

EDIT: meant to mention the navigation button is the only element added via Storyboard, in case that matters

[Screenshot: the editor screen, with the list panel, the drawing panel, and the navigation button at bottom left]

EDIT 2: After some messing around, I'm overriding the hitTest, so:

    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        // Let UIKit find the deepest hit view as usual...
        let view = super.hitTest(point, with: event)
        // ...but if the hit is FrameView itself (not one of its subviews,
        // such as a delete button), return nil so the touch passes
        // through to whatever is behind it.
        return view == self ? nil : view
    }

This wasn't sourced from SO, and although there were some answers here that vaguely suggested this approach, they were (as is common on SO) a) associated with obsolete versions of Swift, b) buried in a different context and c) not returned by any obvious searches.

This site has got to do something about obsolete, heavily upvoted answers. I think that Swift has got to be the worst case for this, since there really are so few users of the older versions, thanks to Apple's forced-upgrade policies.

Thanks to Ptit Xav for sticking your head into my mess.

Dan Donaldson
  • If you think my question isn't constructive, consider how much less helpful that downvote is without any context. I have a technical problem related to iOS programming, and I can't find a solution, including searching here and elsewhere. Without context, I'd have to say this is a case of people whose reasons for being here are based on the need for questions to match their vision of SO, rather than understanding this as a learning resource for people at other levels than theirs. – Dan Donaldson Nov 30 '21 at 04:35
  • Can you show the view hierarchy and which views have user interaction enabled? Maybe there is a parent view of the navigation button that has user interaction disabled. – Ptit Xav Nov 30 '21 at 11:12
  • Hey, @PtitXav, thanks for the response. See my Edit above. I still find it confusing that I couldn't simply prioritize the click receiver through the zPosition, but I'm not going to stay stuck on this. Time to move forward! – Dan Donaldson Nov 30 '21 at 13:25
  • @DanDonaldson - Does "FrameView" have a clear background, but covers the entire screen? It's not clear what you mean by *"increasing the zPosition of the navigation button"* .. are you inserting "FrameView" below the button in the view hierarchy? – DonMag Nov 30 '21 at 13:26
  • @DonMag - yes, I am, and as far as I can determine, it's not that. But otoh, the fix I am using (see question) does work, so it seems to indicate that the view is in the way. Because I was aware of the zPosition of both objects, and thought I had dealt with them both, my question's subtext was, "Is there any other determinant for what gets clicks/taps besides zPosition in this case?" As it is, the solution I have works, and I'll return to this at a later date, to figure out what I'm doing wrong with regards to my original solution. – Dan Donaldson Dec 02 '21 at 03:26
  • @DanDonaldson -- ok... you say your button is added via Storyboard... how are you *"increasing the zPosition"*? Are you using `view.insertSubview(_:at:)`? or `view.insertSubview(_:belowSubview:)`? If you set the background color of "FrameView" to `.red` instead of `.clear`, does it cover the button? – DonMag Dec 02 '21 at 13:31
  • Hey, @DonMag, I'm increasing the zPosition using .layer.zPosition. I appreciate the interest you're taking. I do have a fix that works, added to the question, which doesn't require that I manage these issues. I have a tight delivery schedule on another project, so I've been pulled off the original project. I'll be back on it, and I'll return to this then. Thanks again! – Dan Donaldson Dec 03 '21 at 01:24

1 Answer


"I'm increasing the zPosition using .layer.zPosition"

OK, that's the issue.

Changing the .zPosition of a view's layer does NOT change its order in the view hierarchy.

Take a look at this layout, with two buttons added in Storyboard / IB:

[Screenshot: Storyboard layout with the First and Second buttons]

Using this basic code, we'll add a "frameView" subview, with user interaction enabled, so that it covers both buttons.

We'll use .layer.zPosition to make the First button visible, but we CAN'T tap it.

We'll use .bringSubviewToFront() to make the Second button not only visible, but it will also change the view hierarchy so we CAN tap it.

class HierarchyViewController: UIViewController {
    
    @IBOutlet var firstButton: UIButton!
    @IBOutlet var secondButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        
        let frameView = UIView()
        frameView.backgroundColor = .systemBlue
        frameView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(frameView)
        let g = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            frameView.topAnchor.constraint(equalTo: g.topAnchor, constant: 20.0),
            frameView.leadingAnchor.constraint(equalTo: g.leadingAnchor, constant: 20.0),
            frameView.trailingAnchor.constraint(equalTo: g.trailingAnchor, constant: -20.0),
            frameView.bottomAnchor.constraint(equalTo: g.bottomAnchor, constant: -20.0),
        ])

        // make sure our added view has user interaction enabled
        frameView.isUserInteractionEnabled = true
        
        // change the zPosition of the first button
        firstButton.layer.zPosition = 999
        
        // change the view Hierarchy for the second button
        view.bringSubviewToFront(secondButton)
    }
    
    @IBAction func firstButtonTap(_ sender: Any) {
        print("First Tapped!")
    }
    
    @IBAction func secondButtonTap(_ sender: Any) {
        print("Second Tapped!")
    }
    
}

Here's how it looks at runtime:

[Screenshot: runtime appearance — both buttons visible over the blue frameView]

No visual difference, but if we use Debug View Hierarchy we can clearly see that the First button is behind the "frameView" while the Second button is in front of it:

[Screenshot: Debug View Hierarchy — the First button sits behind frameView; the Second button sits in front of it]

While "frameView" has user interaction disabled, we can "tap through it" to the First button. But once we enable user interaction on "frameView", it grabs the touch and we can't tap the First button. Changing the view hierarchy with .bringSubviewToFront() resolves the issue. (Note: I used a blue background to make it easy to see "frameView" ... the same applies if it has a clear background.)
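One way to see why zPosition doesn't help with touches (a minimal sketch, not from the answer above): hit-testing walks the superview's `subviews` array, and `layer.zPosition` only affects how layers are drawn, not that array.

```swift
import UIKit

let container = UIView()
let button = UIButton(type: .system)
let frameView = UIView()

container.addSubview(button)     // index 0
container.addSubview(frameView)  // index 1 — frontmost for hit-testing

// Raising the layer's zPosition draws the button on top...
button.layer.zPosition = 999
// ...but the subview order, which hit-testing uses, is unchanged:
print(container.subviews == [button, frameView])  // true

// bringSubviewToFront(_:) actually reorders the subviews array,
// so hit-testing now reaches the button before frameView:
container.bringSubviewToFront(button)
print(container.subviews == [frameView, button])  // true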

DonMag