
I have an AccessibilityService that takes in input from game controllers (ps5 controller, xbox controller, etc.).

I use the onKeyEvent() method to handle button presses and releases, so those are easy to handle. The problem I am facing is how to handle the joystick movements and trigger presses, as I do not know how to handle them through an AccessibilityService.

Normally, I would simply use onGenericMotionEvent() to handle these MotionEvents, but unfortunately AccessibilityService does not provide such a method. I have looked at the docs and official codelabs for almost 3 weeks with no luck. If someone could tell me how to handle MotionEvents through an AccessibilityService, I would be very relieved.

The MotionEvents I want to handle are these:

AXIS_X, AXIS_Y, AXIS_RZ, AXIS_RY, AXIS_RX, AXIS_HAT_X, AXIS_HAT_Y, AXIS_LTRIGGER, AXIS_RTRIGGER, AXIS_BRAKE, AXIS_GAS.

There may be others depending on the controller, but these are the main ones I need to handle input from my controller.
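For reference, outside an AccessibilityService these axes would normally be read in an Activity roughly like this (a minimal sketch; `applyDeadzone` and the 0.1 deadzone value are my own assumptions, not part of the platform API):

```kotlin
import android.app.Activity
import android.view.InputDevice
import android.view.MotionEvent
import kotlin.math.abs

// Zero out small stick noise; the 0.1f deadzone is an arbitrary choice.
fun applyDeadzone(value: Float, deadzone: Float = 0.1f): Float =
    if (abs(value) < deadzone) 0f else value

class GameActivity : Activity() {
    override fun onGenericMotionEvent(event: MotionEvent): Boolean {
        if (event.source and InputDevice.SOURCE_JOYSTICK == InputDevice.SOURCE_JOYSTICK &&
            event.action == MotionEvent.ACTION_MOVE
        ) {
            val leftX = applyDeadzone(event.getAxisValue(MotionEvent.AXIS_X))
            val leftY = applyDeadzone(event.getAxisValue(MotionEvent.AXIS_Y))
            val rightX = applyDeadzone(event.getAxisValue(MotionEvent.AXIS_RX))
            val rightY = applyDeadzone(event.getAxisValue(MotionEvent.AXIS_RY))
            val lTrigger = event.getAxisValue(MotionEvent.AXIS_LTRIGGER)
            val rTrigger = event.getAxisValue(MotionEvent.AXIS_RTRIGGER)
            // handle the values here
            return true
        }
        return super.onGenericMotionEvent(event)
    }
}
```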

Regards, 0xB01b


4 Answers


The API doesn't support these because an AccessibilityService can only filter KeyEvents, and, as you say, the joystick axes and triggers do not produce KeyEvents.

Can you explain what you're building? Understanding the impact of adding this API on the lives of people with disabilities would help us prioritize this work along with the other things we're planning.

Phil Weaver
  • I'm working on a keymapper for mapping touch and other gestures using gamepads, keyboards, mice, etc. I think it makes things a lot easier for people in general, letting them manipulate their phones more easily. Joysticks could be used to map cursors onto the screen for people with disabilities, and could replicate a desktop environment with mouse and keyboard. – 0xB01b Feb 05 '21 at 07:43

In an Accessibility Service, you can only get and filter KeyEvents; MotionEvents cannot be handled inside that service.

One possible way is to create a service that extends InputMethodService. This is similar to creating a virtual keyboard, and it lets you handle KeyEvents and MotionEvents globally: once the InputMethodService is enabled (like changing the keyboard input), you can see whatever the input is in every activity. The Accessibility Service can then be used to dispatch a touch or swipe.

That combination (InputMethodService + AccessibilityService) enables you to map a joystick to Touch and Swipe globally.

However, to make it act like an actual joystick mapper, I don't think the Accessibility Service can simulate MotionEvents the way you expect. Only injectInputEvent() can do exactly what you want, but that runs into an old problem: the INJECT_EVENTS permission.
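The AccessibilityService side of that combination could be sketched as follows (`dispatchTap` and `axisToDeltaPx` are my own hypothetical names, and the 50 ms duration is an arbitrary choice):

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path

// Map a normalized axis value in [-1, 1] to a per-frame cursor delta in pixels.
fun axisToDeltaPx(axis: Float, maxSpeedPx: Float): Float = axis * maxSpeedPx

// Dispatch a short tap at (x, y) through the AccessibilityService.
fun AccessibilityService.dispatchTap(x: Float, y: Float, durationMs: Long = 50L) {
    val path = Path().apply { moveTo(x, y) }
    val gesture = GestureDescription.Builder()
        .addStroke(GestureDescription.StrokeDescription(path, 0L, durationMs))
        .build()
    dispatchGesture(gesture, null, null)
}
```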

  • So I create another service that extends InputMethodService, and then I use that within my AccessibilityService? And use `InjectInputEvent()` instead of `DispatchGesture()`? – 0xB01b Feb 05 '21 at 07:48
  • No. Two services are independent. In AccessibilityService, the Tap and Swipe actions can be simulated by using the dispatchGesture with the GestureDescription.StrokeDescription. The InputMethodService can be used to handle the input (KeyEvent and MotionEvent) that you take from joystick. And inside the method that you used to handle it, you can call the method to dispatch the gesture in the AccessibilityService. About the InjectInputEvent, I still do not know how to Inject it without root or specific permission. – Cong D. Dang Feb 05 '21 at 08:10
  • One more problem with AccessibilityService is that there is no way to create a Path that can be dispatched to simulate the release action in the ACTION_UP event. I mean moveTo() and lineTo() can be used in ACTION_DOWN to tap and hold with the willContinue value, but then how do you release the tap in ACTION_UP? – Cong D. Dang Feb 05 '21 at 08:25
  • For releasing the tap I am using `continueStroke()` with a stroke that has no duration; it seems a bit buggy but is working for now. Also, if I activate my AccessibilityService by pressing the two volume rockers simultaneously, can I programmatically activate the InputMethodService, or does the user have to activate both services separately? Also, how do I call the method for dispatching gestures from my InputMethodService? I cannot have a static method for that, and I'm not sure if I can just make an object of it. – 0xB01b Feb 05 '21 at 08:27
  • The InputMethodService also needs to be activated separately, but unlike the accessibility service it does not need to be activated every time you start. It is similar to a virtual keyboard, so if you change the default input method to it, the user does not need to activate it again. – Cong D. Dang Feb 05 '21 at 08:31
  • Ah, that is good to hear. Then the user would not get frustrated every time he wants to use the AccessibilityService. The only thing I don't fully understand is how I am to call the `dispatchGesture()` method from the InputMethodService; I cannot put it in a static method, and I'm not sure if making an object of my AccessibilityService would work. – 0xB01b Feb 05 '21 at 08:34
  • You can programmatically show the input method picker to change the input method to your virtual keyboard and start the service. Requiring user permission twice is not great for an app, but it does work globally with a joystick. The problem is that simulating thumbstick motion with an AccessibilityService is limited. – Cong D. Dang Feb 05 '21 at 08:35
  • Normally an object of a Service cannot be used to call its methods. But for the AccessibilityService, you can call the public methods declared inside the service. So just create an object of it, and assign the service to the object in onCreate(). – Cong D. Dang Feb 05 '21 at 08:39
  • So just `InputService input = this;` in my `onCreate()` and that would work? In `onDestroy()` could I also revert the input method back to the normal one? – 0xB01b Feb 05 '21 at 08:41
  • Yes that should work. I'm not so sure whether it can be reverted into the normal one programmatically. Haha – Cong D. Dang Feb 05 '21 at 08:52
  • Alright, thank you so much for your help. Hopefully I will be able to resolve the bug with this. – 0xB01b Feb 05 '21 at 09:18
  • Good luck. I'm also doing a project with joystick mapping, but just for personal use and for playing games that don't support joysticks. It is still not finished because of the limits of the Accessibility Service. There are some apps that can actually inject a touch event over other apps, but I think they don't use an AccessibilityService. – Cong D. Dang Feb 05 '21 at 09:37
  • How is this going, bro? – Cong D. Dang Feb 11 '21 at 06:17
  • I have not tried the fix yet because unfortunately this week was the one week I had my mid term mock exams for the 11th grade, so I got the fix but couldn't try it out. Tomorrow is the last day of tests and then I have holidays for a week, so then I should be able to try it out. Thanks for still checking in after all this time :) – 0xB01b Feb 11 '21 at 14:43
  • Hey, could you give me a code snippet of how to make this InputMethodService? I'm having trouble finding one, and I need to translate it from the native SDK into the framework I'm using :( – 0xB01b Feb 15 '21 at 15:33

To create an InputMethodService, you can refer to the official documentation.

First, add the service in AndroidManifest.xml:

<service
    android:name=".IMService"
    android:permission="android.permission.BIND_INPUT_METHOD">
    <intent-filter>
        <action android:name="android.view.InputMethod"/>
    </intent-filter>
    <meta-data
        android:name="android.view.im"
        android:resource="@xml/method"/>
</service>

Next, method.xml needs to be created under the res/xml/ folder:

<?xml version="1.0" encoding="utf-8"?>
<input-method xmlns:android="http://schemas.android.com/apk/res/android"
    android:isDefault="false"
    android:settingsActivity=".MainActivity">
    <subtype
        android:icon="@mipmap/ic_launcher"
        android:imeSubtypeLocale="en_US"
        android:imeSubtypeMode="keyboard"
        android:label="@string/app_name"/>
</input-method>

A layout for the keyboard is not needed in this case.

Then, create the class IMService extending InputMethodService.

The following example is written in Kotlin:

class IMService : InputMethodService() {
    companion object {
        private const val TAG = "IMService"
    }

    override fun onKeyDown(keyCode: Int, event: KeyEvent): Boolean {
        if (event.action == KeyEvent.ACTION_DOWN) {
            if (event.source and InputDevice.SOURCE_GAMEPAD == InputDevice.SOURCE_GAMEPAD) {
                // process key down event here
                return true   // return true if you want the key event to be filtered
            }
        }
        return super.onKeyDown(keyCode, event)
    }

    override fun onKeyUp(keyCode: Int, event: KeyEvent): Boolean {
        if (event.action == KeyEvent.ACTION_UP) {
            if (event.source and InputDevice.SOURCE_GAMEPAD == InputDevice.SOURCE_GAMEPAD) {
                // process key up event here
                return true
            }
        }
        return super.onKeyUp(keyCode, event)
    }

    override fun onGenericMotionEvent(event: MotionEvent): Boolean {
        if (event.action == MotionEvent.ACTION_MOVE) {
            if (event.source and InputDevice.SOURCE_JOYSTICK == InputDevice.SOURCE_JOYSTICK) {
                // process motion event here
                return true
            }
        }
        return super.onGenericMotionEvent(event)
    }
}
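Inside the `// process motion event here` branch, the axes from the question can be read with `getAxisValue()`. A sketch (`hatToDirection` and `readAxes` are hypothetical helper names; the assumption that AXIS_BRAKE/AXIS_GAS mirror the triggers holds only on some controllers):

```kotlin
import android.view.MotionEvent

// Convert the discrete d-pad (HAT) axes into a named direction.
fun hatToDirection(hatX: Float, hatY: Float): String = when {
    hatX < -0.5f -> "LEFT"
    hatX > 0.5f  -> "RIGHT"
    hatY < -0.5f -> "UP"
    hatY > 0.5f  -> "DOWN"
    else         -> "CENTER"
}

fun readAxes(event: MotionEvent) {
    val leftX = event.getAxisValue(MotionEvent.AXIS_X)
    val leftY = event.getAxisValue(MotionEvent.AXIS_Y)
    val rightX = event.getAxisValue(MotionEvent.AXIS_RX)
    val rightY = event.getAxisValue(MotionEvent.AXIS_RY)
    val dpad = hatToDirection(
        event.getAxisValue(MotionEvent.AXIS_HAT_X),
        event.getAxisValue(MotionEvent.AXIS_HAT_Y)
    )
    // On some controllers the triggers report on AXIS_BRAKE/AXIS_GAS instead.
    val lTrigger = maxOf(
        event.getAxisValue(MotionEvent.AXIS_LTRIGGER),
        event.getAxisValue(MotionEvent.AXIS_BRAKE)
    )
    val rTrigger = maxOf(
        event.getAxisValue(MotionEvent.AXIS_RTRIGGER),
        event.getAxisValue(MotionEvent.AXIS_GAS)
    )
    // forward the values to whatever consumes them
}
```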

The service lifecycle starts after you enable the service and change the default input method under Settings / ... / Language and keyboard / Keyboard list and default.
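The default IME cannot be switched silently; the closest you can get programmatically is showing the system picker. A sketch (`imeId`, `ensureImeSelected`, and the `.IMService` class name are my own assumptions):

```kotlin
import android.content.Context
import android.provider.Settings
import android.view.inputmethod.InputMethodManager

// Build the IME id string as it appears in Settings.Secure.DEFAULT_INPUT_METHOD.
fun imeId(packageName: String, serviceClass: String): String = "$packageName/$serviceClass"

fun ensureImeSelected(context: Context) {
    val current = Settings.Secure.getString(
        context.contentResolver, Settings.Secure.DEFAULT_INPUT_METHOD
    )
    if (current != imeId(context.packageName, ".IMService")) {
        // The default IME cannot be changed without user consent;
        // show the system input method picker instead.
        val imm = context.getSystemService(Context.INPUT_METHOD_SERVICE) as InputMethodManager
        imm.showInputMethodPicker()
    }
}
```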

  • I did this and it worked; I could set it from the keyboard list, and I also translated it over to the C# framework I'm using. However, for some reason the methods aren't firing on key-down and key-up events from my keyboard when I check them with a breakpoint? – 0xB01b Feb 19 '21 at 14:56
  • I'm not sure about how to convert to Xamarin android. You can check this example for reference: https://github.com/Vaikesh/CustomKeyboard – Cong D. Dang Feb 20 '21 at 13:51
  • I mean for the normal Android SDK as well: the breakpoints I set in Android Studio don't get hit when I press the down button on my keyboard. Do you get the same problem? Also, thanks for that link. – 0xB01b Feb 20 '21 at 15:50
  • Hi Cong. Nope not yet, but luckily it gave me time to work on some other code I needed – 0xB01b Mar 31 '21 at 17:29
  • Hi, onGenericMotionEvent, onKeyDown, and onKeyUp are not being called on any of my keyboard/controller input. I suspect that the InputMethodService is only active when it is invoked as an IME input, such as in a TextView. I am not sure if this is why, though. Do you know why it's not working for me? – poetryrocksalot Jul 09 '21 at 22:56

To create an Accessibility Service that can be combined with the Input Method Service, you can try the following snippet:

var tapService: TapService? = null

class TapService : AccessibilityService() {
    companion object {
        private const val TAG = "TapService"
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) { }

    override fun onInterrupt() { }

    override fun onServiceConnected() {
        super.onServiceConnected()
        Log.d(TAG, "onServiceConnected")

        tapService = this

        startActivity(Intent(this, MainActivity::class.java).addFlags(Intent.FLAG_ACTIVITY_NEW_TASK))
    }

    override fun onUnbind(intent: Intent?): Boolean {
        Log.d(TAG, "onUnbind")
        tapService = null
        return super.onUnbind(intent)
    }

    override fun onDestroy() {
        Log.d(TAG, "onDestroy")
        tapService = null
        super.onDestroy()
    }

    fun tap(x: Int, y: Int, hold: Boolean) {
        // create path and build the GestureDescription then dispatch it
    }

    fun swipe(fromX: Int, fromY: Int, toX: Int, toY: Int, duration: Long) {
        // similar to tap function
    }
}
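One way to fill in the tap() stub, including the press-and-hold/release pattern discussed in the comments on the earlier answer, might look like this (`TapHelper` and `isLongPress` are hypothetical names; `continueStroke()` requires API 26+, and the durations and threshold are arbitrary):

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path

// Decide whether a button press should become a hold; 400 ms is an arbitrary threshold.
fun isLongPress(pressedMs: Long, thresholdMs: Long = 400L): Boolean = pressedMs >= thresholdMs

// Press-and-hold / release pair using willContinue and continueStroke() (API 26+).
class TapHelper(private val service: AccessibilityService) {
    private var heldStroke: GestureDescription.StrokeDescription? = null

    fun press(x: Float, y: Float) {
        val path = Path().apply { moveTo(x, y) }
        // willContinue = true keeps the pointer down when the stroke ends
        val stroke = GestureDescription.StrokeDescription(path, 0L, 1L, true)
        heldStroke = stroke
        service.dispatchGesture(
            GestureDescription.Builder().addStroke(stroke).build(), null, null
        )
    }

    fun release(x: Float, y: Float) {
        val held = heldStroke ?: return
        val path = Path().apply { moveTo(x, y) }
        // continueStroke() with willContinue = false lifts the pointer
        val up = held.continueStroke(path, 0L, 1L, false)
        service.dispatchGesture(
            GestureDescription.Builder().addStroke(up).build(), null, null
        )
        heldStroke = null
    }
}
```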

When the user starts the TapService in the Accessibility settings menu, it calls onServiceConnected and assigns the service to the tapService object.

Hence, you can call tapService?.tap() or tapService?.swipe() elsewhere, for example in the Input Method Service, to inject the touch action.

However, because of the limitations of the Accessibility Service, the simulated touch actions may not behave exactly as you would expect.
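Putting the two services together, the InputMethodService side might forward stick motion like this (a sketch; `swipeEnd`, `forwardJoystickToTapService`, the anchor point, the 100 px range, and the 100 ms duration are all arbitrary assumptions):

```kotlin
import android.view.InputDevice
import android.view.MotionEvent

// Compute a swipe end coordinate from a stick axis value in [-1, 1].
fun swipeEnd(start: Int, axis: Float, rangePx: Int): Int = start + (axis * rangePx).toInt()

// Called from IMService.onGenericMotionEvent() to turn left-stick motion
// into a swipe dispatched by the AccessibilityService via the global tapService.
fun forwardJoystickToTapService(event: MotionEvent) {
    if (event.source and InputDevice.SOURCE_JOYSTICK != InputDevice.SOURCE_JOYSTICK) return
    val x = event.getAxisValue(MotionEvent.AXIS_X)
    val y = event.getAxisValue(MotionEvent.AXIS_Y)
    val startX = 500  // arbitrary anchor point on screen
    val startY = 500
    tapService?.swipe(
        startX, startY,
        swipeEnd(startX, x, 100), swipeEnd(startY, y, 100),
        100L
    )
}
```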