
In my application, when I receive an ON_WM_RBUTTONDOWN() message in a certain window, I create a CMenu, populate it with some items, and then display it with TrackPopupMenu(xxx). The menu needs no other interaction with Windows messages to be created. It defaults to accepting left clicks to select items, and I can see those messages coming in when I use the mouse.
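For reference, the setup described here looks roughly like the following. This is a sketch, not the asker's actual code: the window class, handler wiring, item IDs, and strings are all invented for illustration.

```cpp
// Assumes CMyWnd derives from CWnd and its message map contains
// ON_WM_RBUTTONDOWN() to route WM_RBUTTONDOWN to this handler.
void CMyWnd::OnRButtonDown(UINT nFlags, CPoint point)
{
    CMenu menu;
    menu.CreatePopupMenu();
    menu.AppendMenu(MF_STRING, ID_ITEM_ONE, _T("First item"));   // IDs are hypothetical
    menu.AppendMenu(MF_STRING, ID_ITEM_TWO, _T("Second item"));

    // TrackPopupMenu expects screen coordinates; WM_RBUTTONDOWN supplies
    // client coordinates, so convert before displaying the menu.
    ClientToScreen(&point);
    menu.TrackPopupMenu(TPM_LEFTALIGN | TPM_LEFTBUTTON,
                        point.x, point.y, this);

    CWnd::OnRButtonDown(nFlags, point);
}
```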

I'm trying to allow use of this context menu on a touch screen. The parent window can receive WM_GESTURENOTIFY messages (for other functionality), and in all other parts of my app, such as other windows and dialogs, touch gestures work fine: Spy++ shows gesture messages and a WM_LBUTTONDOWN, which gives me normal behaviour across the app. I can touch-select menu items when this menu is opened with a physical mouse right click, with the touch input coming through as a WM_LBUTTONDOWN.
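As background, gesture plumbing of the kind described here is typically wired up roughly as below. This is a sketch under assumptions: the handler name and the ON_MESSAGE routing are not from the question, though the asker does mention using SetGestureConfig elsewhere in the app.

```cpp
// Respond to WM_GESTURENOTIFY by opting in to all gestures for this window.
// Routed via ON_MESSAGE(WM_GESTURENOTIFY, OnGestureNotify) in the message
// map; the member name OnGestureNotify is an assumption.
LRESULT CMyWnd::OnGestureNotify(WPARAM wParam, LPARAM lParam)
{
    GESTURECONFIG gc = { 0, GC_ALLGESTURES, 0 };  // dwID 0 = apply to all gestures
    SetGestureConfig(m_hWnd, 0, 1, &gc, sizeof(GESTURECONFIG));
    return Default();  // let DefWindowProc finish the notification
}
```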

I've tried creating and displaying this menu by calling that same creation code again from a touch message, or by just sending the window an ON_WM_RBUTTONDOWN() message manually after a touch, with the same flags. The menu is created fine and works as normal with a mouse, and as far as the app is concerned nothing is different. However, the CMenu is not receiving any touch messages at all: I get the touch-style cursor showing where I'm tapping, but nothing is being piped through to the menu.
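What "sending the message manually" amounts to is roughly the following one-liner; pWnd (the target window) and pt (the touch point in client coordinates) are assumptions about the asker's setup, not quoted code.

```cpp
// Synthesise a right click at the touch point: WM_RBUTTONDOWN packs the
// button state into wParam and the client-area point into lParam.
pWnd->SendMessage(WM_RBUTTONDOWN, MK_RBUTTON, MAKELPARAM(pt.x, pt.y));
```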

I've also tried switching from gestures to registering for touch input while this interaction happens, and ensuring the original gesture info handle is closed in case it was blocking input for whatever reason.
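A sketch of those two attempts, under the assumption that hwnd and lParam come from the window's own message handling:

```cpp
// 1) Switch from gesture messages to raw touch for this interaction;
//    the window then receives WM_TOUCH instead of WM_GESTURE.
RegisterTouchWindow(hwnd, 0);

// 2) In the WM_GESTURE handler, release the gesture info handle after
//    processing so it cannot block further input.
GESTUREINFO gi = { sizeof(GESTUREINFO) };
GetGestureInfo((HGESTUREINFO)lParam, &gi);
// ... act on gi ...
CloseGestureInfoHandle((HGESTUREINFO)lParam);
```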

My assumption is that, to route these messages, Windows is doing something additional behind the scenes beyond what my app is aware of, so I'm a bit stuck for a solution.

Charlie Hermans
  • Did you call RegisterTouchWindow? ON_WM_RBUTTONDOWN is not a message; it is a macro that maps the WM_RBUTTONDOWN message from the window procedure to the MFC message map. – JohnCz Mar 28 '22 at 13:51
  • @JohnCz I've tried calling RegisterTouchWindow on the parent window, but still no joy. The app itself doesn't call that, as I want to receive gesture messages instead, which works everywhere else via SetGestureConfig. You're right about the macro; mistake in the post - the correction is that I do send a WM_RBUTTONDOWN message, tried through both SendMessage and DispatchMessage. It displays the popup menu fine and works with mouse input, but not touch. I'd at least expect consistent behaviour where Windows sends a left click on the touch. – Charlie Hermans Mar 28 '22 at 14:14
  • OK, what kind of application is it? MDI/SDI? Dialog-based? Doc/View architecture? Which window are you trying to process touch-screen interaction in? Are you, or any base class, handling PreTranslateMessage? It is very hard to guess what is going on without actually debugging the code. I have a suggestion, though: try handling the WM_CONTEXTMENU message instead of WM_RBUTTONDOWN. Best of all would be if you could create a test app that duplicates this problem and share the code. – JohnCz Mar 29 '22 at 19:45
  • I managed to find a solution and added it as an answer, but for the sake of completeness, to answer yours: it's a bastardised MDI which doesn't really conform to a single architecture (it's quite an old legacy codebase). The touch interaction is being handled by one of the windows, and PreTranslateMessage is handled further up the stack. I did end up using a Visual Studio-generated MFC app to replicate the issue, but given what the solution ended up being, it seems I was just going about this the wrong way. Thanks for looking at this, though. – Charlie Hermans Apr 14 '22 at 13:12

1 Answer


I was able to get around this issue by enabling the tablet press-and-hold gesture (MFC disables it by default). Windows treats press-and-hold as a right click, which produces a properly interactable context menu, rather than my having to send the right-click message myself. This works on a desktop with a touch screen and on a Windows tablet.

https://learn.microsoft.com/en-us/troubleshoot/developer/visualstudio/cpp/language-compilers/mfc-enable-tablet-press-and-hold-gesture

Adding the following override was what enabled this to work:

```cpp
ULONG CMyView::GetGestureStatus(CPoint /*ptTouch*/) { return 0; }
```
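For context, here is a minimal sketch of where that override sits, assuming a CView-derived class (CMyView comes from the answer above; the surrounding class body is illustrative):

```cpp
class CMyView : public CView
{
protected:
    // CWnd::GetGestureStatus returns TABLET_DISABLE_PRESSANDHOLD by
    // default, which is why press-and-hold is normally suppressed in
    // MFC applications.
    virtual ULONG GetGestureStatus(CPoint ptTouch);
    // ... rest of the class ...
};

// Returning 0 re-enables the tablet gestures, so a press-and-hold on the
// touch screen is translated into a right click that opens the menu.
ULONG CMyView::GetGestureStatus(CPoint /*ptTouch*/)
{
    return 0;
}
```

With the gesture enabled, the existing right-click handling fires from a long press, so no synthetic WM_RBUTTONDOWN is needed.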

Charlie Hermans