Scenario: I have written a basic control program for my quadcopter. Basically, I use a notebook with a gamepad connected to it to send commands to my quadcopter; the two are connected to each other via radio. This setup works surprisingly well, but:
Question: How can my Win32 application that runs on my notebook receive gamepad state changes even if the window is not in focus? I know that some programs like AutoHotKey and AutoIt provide exactly that, but I couldn't find any documentation on it, neither in the official XInput documentation nor elsewhere on the internet.
The problem is that I sometimes also use ArduPilot's Mission Planner software simultaneously on the same notebook. So every time I want to look up some telemetry data in Mission Planner, the application window handling the gamepad input goes out of focus and stops receiving gamepad state changes. This is very annoying and, to be honest, also kind of dangerous, because I should have working radio control at all times.
So, is there a way to "lock" gamepad XInput to a particular window handle? The (Win32) application is written in C++ with Visual Studio.
EDIT: I use the native XInput API, hence I call XInputGetState(). My message loop is designed to poll XInputGetState() whenever there is no message to pass to TranslateMessage()/DispatchMessage(), and the window messages are retrieved with the non-blocking PeekMessage().
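For reference, the loop described above looks roughly like the following. This is a minimal sketch, not my actual code; `HandleGamepadState` is a hypothetical handler, and the controller index is assumed to be 0.

```cpp
#include <windows.h>
#include <xinput.h>
#pragma comment(lib, "xinput.lib")

// Sketch of the polling message loop: drain pending window messages
// without blocking, then poll the gamepad when the queue is empty.
int RunMessageLoop()
{
    MSG msg = {};
    for (;;)
    {
        // Non-blocking retrieval of all pending window messages.
        while (PeekMessage(&msg, nullptr, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                return (int)msg.wParam;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        // Queue is empty: poll the first controller.
        XINPUT_STATE state = {};
        if (XInputGetState(0, &state) == ERROR_SUCCESS)
        {
            // HandleGamepadState(state); // hypothetical handler
        }

        Sleep(1); // avoid busy-spinning the CPU
    }
}
```

The symptom: this loop keeps running when the window loses focus, but XInputGetState() no longer reports state changes.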