
I have a task where I need to simulate touch/swipe events at given coordinates over other apps (on any screen of the phone) without affecting user interaction; that is, the programmatic gesture and the user's own touch input need to be processed in parallel. A rooted phone is acceptable.

I have tried the approaches below with the help of the SYSTEM_ALERT_WINDOW ("Draw over other apps") permission, but could not achieve it.

  1. Via AccessibilityService -> dispatchGesture. But gesture dispatch in AccessibilityService is synchronized, so it can process only one event stream (programmatic or user) at a time. Ref: https://github.com/aosp-mirror/platform_frameworks_base/blob/master/core/java/android/accessibilityservice/AccessibilityService.java#L1017
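For reference, a minimal sketch of the dispatchGesture attempt (Android-framework code, not runnable off-device; the 300 ms duration and the method name `swipe` are placeholders):

```java
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;

// Inside a connected AccessibilityService subclass (requires
// android:canPerformGestures="true" in the service configuration).
public class SwipeService extends AccessibilityService {
    public void swipe(float fromX, float fromY, float toX, float toY) {
        Path path = new Path();
        path.moveTo(fromX, fromY);
        path.lineTo(toX, toY);
        GestureDescription gesture = new GestureDescription.Builder()
                .addStroke(new GestureDescription.StrokeDescription(path, 0, 300))
                .build();
        // Dispatch is serialized inside the service, which is why this
        // gesture and a concurrent user touch cannot both be processed.
        dispatchGesture(gesture, null, null);
    }

    @Override
    public void onAccessibilityEvent(android.view.accessibility.AccessibilityEvent e) { }

    @Override
    public void onInterrupt() { }
}
```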

  2. Instrumentation.sendPointerSync -> But it requires the INJECT_EVENTS permission to touch other apps, which is available only to platform-signed apps.
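A sketch of that Instrumentation attempt (Android-framework code, not runnable off-device; the class name `Tapper` and the 50 ms press duration are placeholders). Without INJECT_EVENTS, injecting into a window owned by another app throws a SecurityException:

```java
import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.MotionEvent;

public final class Tapper {
    public static void tap(float x, float y) {
        Instrumentation inst = new Instrumentation();
        long now = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(now, now,
                MotionEvent.ACTION_DOWN, x, y, 0);
        MotionEvent up = MotionEvent.obtain(now, now + 50,
                MotionEvent.ACTION_UP, x, y, 0);
        inst.sendPointerSync(down);  // blocks until the event is consumed
        inst.sendPointerSync(up);
        down.recycle();
        up.recycle();
    }
}
```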

To overcome this permission issue I changed the INJECT_EVENTS permission's protection level to "instant" and flashed a custom ROM. Programmatic and user touch events still cannot be processed at the same time.

Ref:http://aosp.opersys.com/xref/android-11.0.0_r33/xref/frameworks/base/core/res/AndroidManifest.xml#3189

  3. On a rooted phone, tried executing "/system/bin/input touchscreen swipe fromX fromY toX toY duration" (which is the same as "adb shell input touchscreen swipe"). Programmatic and user touch events still cannot be processed at the same time.
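The root-shell attempt above can be driven from Java. This is a minimal sketch assuming `su` is on the PATH (true on most rooted ROMs); `RootSwipe`, `buildSwipeCommand`, and `runAsRoot` are hypothetical names, and the trailing argument of `input swipe` is the duration in milliseconds:

```java
public final class RootSwipe {
    // Builds the same command line as "adb shell input touchscreen swipe ...".
    public static String buildSwipeCommand(int fromX, int fromY,
                                           int toX, int toY, int durationMs) {
        return "input touchscreen swipe "
                + fromX + " " + fromY + " " + toX + " " + toY + " " + durationMs;
    }

    // Runs the command through a root shell and waits for it to finish.
    public static int runAsRoot(String command) throws Exception {
        Process p = Runtime.getRuntime()
                .exec(new String[] {"su", "-c", command});
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        runAsRoot(buildSwipeCommand(100, 200, 500, 200, 300));
    }
}
```

Even run as root, this path goes through the same InputManager injection as the other attempts, which matches the observed serialization with user touches.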

Is there any way to simulate touches programmatically without affecting user interaction? Is it possible at all?

Thanks in advance.

Suresh

2 Answers


If the device is rooted you can scan "/dev/input/eventXX" for the touchscreen device and then write input events directly to that "file". It's not easy (especially recognizing which device node belongs to the touchscreen), but it works.
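A sketch of that approach, assuming the common 64-bit kernel layout of struct input_event (16-byte timeval, then 16-bit type, 16-bit code, 32-bit value: 24 bytes, little-endian); the event-node path and the multitouch axis codes your driver expects are device-specific assumptions, and writing to /dev/input requires root:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class EventWriter {
    // Linux input constants (from linux/input-event-codes.h).
    public static final int EV_SYN = 0x00;
    public static final int EV_ABS = 0x03;
    public static final int ABS_MT_POSITION_X = 0x35;
    public static final int ABS_MT_POSITION_Y = 0x36;
    public static final int SYN_REPORT = 0x00;

    // Encodes one struct input_event (64-bit ABI: 24 bytes, little-endian).
    public static byte[] encode(int type, int code, int value) {
        ByteBuffer buf = ByteBuffer.allocate(24).order(ByteOrder.LITTLE_ENDIAN);
        buf.putLong(0L);          // tv_sec  (zero: kernel timestamps on write)
        buf.putLong(0L);          // tv_usec
        buf.putShort((short) type);
        buf.putShort((short) code);
        buf.putInt(value);
        return buf.array();
    }

    // Writes a single multitouch position report to a touchscreen node,
    // e.g. "/dev/input/event2" (the right node varies per device).
    public static void writePoint(String devicePath, int x, int y)
            throws IOException {
        try (FileOutputStream out = new FileOutputStream(devicePath)) {
            out.write(encode(EV_ABS, ABS_MT_POSITION_X, x));
            out.write(encode(EV_ABS, ABS_MT_POSITION_Y, y));
            out.write(encode(EV_SYN, SYN_REPORT, 0));
        }
    }
}
```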

emandt
  • Hi @emandt, I have tried that too: I recorded a swipe with "adb shell getevent", converted the hexadecimal values to decimal, and replayed it via "adb shell sendevent". That also pauses during user interaction and resumes after the user's touch completes; it does not execute in parallel. – Suresh Mar 18 '21 at 15:01
  • I must admit that I have never tried to send fake/virtual events to the touchscreen WHILE it is physically in use by the user... maybe the kernel gives priority to hardware events instead of handling both at the same time. I don't think there is any lower-level method to bypass this kernel behaviour. Sorry – emandt Mar 18 '21 at 15:17

You can use Runtime.getRuntime().exec("input touchscreen swipe X1 Y1 X2 Y2").waitFor(), where (X1, Y1) and (X2, Y2) are the start and end coordinates of your swipe. Note that injecting into other apps this way still requires running the command as the root or shell user, since an ordinary app process lacks INJECT_EVENTS.