Is there a way to distinguish whether a button was clicked with a mouse or tapped on a touchscreen in WPF?
3 Answers
You can subscribe to PreviewMouseDown and PreviewTouchDown.
Page.xaml
<Button PreviewMouseDown="Button_PreviewMouseDown"
        PreviewTouchDown="Button_PreviewTouchDown" />
Page.xaml.cs
private void Button_PreviewMouseDown(object sender, MouseButtonEventArgs e)
{
MessageBox.Show("Mouse was used.");
}
private void Button_PreviewTouchDown(object sender, TouchEventArgs e)
{
MessageBox.Show("Touchscreen was used.");
}
I don't believe you'll be able to tell the input source from the event args in the actual Click event.
If you need to do the work there rather than in the preview events, I would recommend setting an instance variable in the preview handlers so that when you reach the Click event you know where the input came from.
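A sketch of that instance-variable approach (the handler and field names are my own, not from the question). Note that WPF promotes unhandled touch input into mouse events, so a promoted PreviewMouseDown also fires after PreviewTouchDown; checking MouseEventArgs.StylusDevice, which is non-null for promoted events, keeps the flag accurate:

```csharp
private bool _wasTouch;

private void Button_PreviewTouchDown(object sender, TouchEventArgs e)
{
    _wasTouch = true;
}

private void Button_PreviewMouseDown(object sender, MouseButtonEventArgs e)
{
    // A promoted mouse event (originating from touch) carries a StylusDevice;
    // input from a physical mouse does not.
    _wasTouch = e.StylusDevice != null;
}

private void Button_Click(object sender, RoutedEventArgs e)
{
    MessageBox.Show(_wasTouch ? "Touchscreen was used." : "Mouse was used.");
}
```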

Excellent answer, thank you so much for your input. I especially like your last sentence as that is ultimately what I am trying to do. – John Aug 10 '16 at 21:16
You have to set up an event handler. In the designer, double-click the button and that will set one up for you.
Then, in the code-behind, add whatever code you want.
private void Button_Click(object sender, RoutedEventArgs e)
{
this.Title = "Clicked";
}
You can add touch events as well: TouchDown, TouchUp, etc.
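For example, a TouchDown handler wired the same way (the handler name is assumed; add TouchDown="Button_TouchDown" to the button in XAML):

```csharp
private void Button_TouchDown(object sender, TouchEventArgs e)
{
    this.Title = "Touched";
}
```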

But how do you know if the click event was due to a mouse click or touch screen? – John Aug 09 '16 at 19:36
I do not believe the Click event will fire for a touch; you need to set up a touch event for that. My touch pad is broken today so I can't verify, but I think you need to set up one of the touch event handlers for that. Also, the mouse will pass a MouseEventArgs with the button that was pressed, but the touch pad won't. – GreatJobBob Aug 09 '16 at 19:39
Windows 7 and later versions can receive input from multiple touch-sensitive devices. WPF applications can handle touch input in the same way as other input, such as the mouse or keyboard, by raising events when a touch occurs.
WPF exposes two types of events when a touch occurs: touch events and manipulation events. Touch events provide raw data about each finger on a touchscreen and its movement. Manipulation events interpret the input as certain actions.
For example, you can interact with an application by using one or more fingers on a touch-sensitive device, such as a touchscreen. The linked walkthrough creates an application that enables the user to move, resize, or rotate a single object by using touch.
Source (MSDN): https://msdn.microsoft.com/en-us/library/ee649090.aspx
Also read this CodeProject article: http://www.codeproject.com/Articles/692286/WPF-and-multi-touch
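As a rough illustration of the manipulation events described above (element and handler names are my own, not from the linked walkthrough), a sketch that lets a touch gesture drag a rectangle:

```xml
<Rectangle Name="rect" Width="100" Height="100" Fill="Red"
           IsManipulationEnabled="True"
           ManipulationDelta="Rect_ManipulationDelta">
    <Rectangle.RenderTransform>
        <MatrixTransform />
    </Rectangle.RenderTransform>
</Rectangle>
```

```csharp
private void Rect_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    // Apply the finger movement reported since the last ManipulationDelta event.
    var transform = (MatrixTransform)rect.RenderTransform;
    var matrix = transform.Matrix;
    matrix.Translate(e.DeltaManipulation.Translation.X,
                     e.DeltaManipulation.Translation.Y);
    transform.Matrix = matrix;
    e.Handled = true;
}
```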
