
Hello, I'm building a weak AI (a bot) app on Android, but I'm fairly new to this.

Context: The app is/will be composed of a UI to start/stop the bot and modify its settings, a screen capture service taking screenshots at fixed intervals (say, every 5 seconds), an image recognition module (OpenCV), and a touch simulation service (Instrumentation class, MotionEvent class). Apart from the UI, obviously, every module should be able to run in the background once the bot is launched.

Question: What is the most efficient, sensible way to make a service that can capture the screen at fixed intervals from the background?

I looked at the MediaProjection API docs and demos, then I started to make an IntentService that uses MediaProjection to record the screen and a Timer + scheduled TimerTask to save bitmaps, but I'm a bit lost. Here is what I have so far:

Removed

I feel I'm doing it wrong. Can you help me figure out how to do this, with explanations, advice, snippets, tutorials, or anything helpful, please?
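For the "fixed intervals" part on its own, the Timer + TimerTask pairing can be replaced by a `ScheduledExecutorService`, which handles rescheduling and clean shutdown for you. A minimal, hypothetical sketch (the class name `CaptureScheduler` and the idea of passing the capture step as a `Runnable` are my assumptions, not code from the question):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class CaptureScheduler {
    private final ScheduledExecutorService executor =
            Executors.newSingleThreadScheduledExecutor();

    // captureTask would, e.g., grab the latest frame from the
    // MediaProjection output and save it as a bitmap.
    public void start(Runnable captureTask, long intervalMillis) {
        // scheduleWithFixedDelay waits intervalMillis between the END of one
        // run and the START of the next, so slow captures never pile up.
        executor.scheduleWithFixedDelay(captureTask, 0, intervalMillis,
                TimeUnit.MILLISECONDS);
    }

    public void stop() {
        executor.shutdownNow();
    }
}
```

Calling `start(task, 5000)` would fire the capture every 5 seconds until `stop()` is called, without the drift-and-crash quirks Timer has when a task throws.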

Edit: Indeed, it is much simpler to achieve this using Runtime to execute adb shell commands. Plus, this allows doing the image processing on the computer, which is faster.
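The computer-side capture described in the edit can be sketched as a small helper that shells out to a command and collects its stdout bytes. This is an assumed illustration, not the poster's actual code; it presumes `adb` is on the PATH and that the device supports `adb exec-out` (which writes the raw PNG from `screencap -p` to stdout):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class AdbScreenshot {
    // Runs an external command and returns everything it wrote to stdout.
    public static byte[] run(String... command)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command).start();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        InputStream in = p.getInputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        p.waitFor();
        return out.toByteArray();
    }
}
```

Usage (assumed): `byte[] png = AdbScreenshot.run("adb", "exec-out", "screencap", "-p");` — the returned bytes are a PNG you can feed straight into OpenCV on the desktop side.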

  • "and a touch simulation service(Instrumentation class, MotionEvent class)" -- that's not going to work from an app. From tests driven by a development machine and the Android SDK it can work, but if you are doing that, you may as well automate the screenshots from the development machine as well. With regards to your screenshots, do not use an `IntentService`, as that gets destroyed after `onHandleIntent()` returns. Use a regular `Service`, one where the user can control the lifetime (e.g., foreground service managed by notification actions). – CommonsWare Jun 29 '16 at 11:01
  • @CommonsWare Thanks for your insight. So I should instead go for a computer app and use the debug bridge to perform touches and screenshots, so I could use a test API like MonkeyRunner or Robotium? – MojoOverflow Jun 29 '16 at 11:14
  • Your choices are either that, or to limit yourself to rooted devices. There *might* be a way to get `Instrumentation` to work on those, and if not, there are low-level Linux-y ways of faking user input on rooted devices. I suppose another possibility is to try using the accessibility APIs, though they do not give you as much flexibility. – CommonsWare Jun 29 '16 at 11:23
  • Maybe I should have mentioned it, but the app I'm trying to monitor doesn't work well with other apps requiring SU, even if I cloak them. Anyway, I will follow the debug bridge lead and gather information about it. Thank you again for the advice; I will update the post with what I find. – MojoOverflow Jun 29 '16 at 13:56
