
TL;DR: I want some way to read HDMI data directly in code rather than outputting it to a monitor. The language doesn't really matter.

Background: A friend of mine was curious about trying to write an AI for a video game we play, so we were discussing how to do it. At first he suggested using a camera to read the screen and then feeding that to the AI, but I was thinking there'd be less latency if we could just pull the visual data directly from the HDMI cable instead of adding another camera to the mix.

There seem to be some similar questions here, but all the ones I've found were unanswered.

After much googling, I'm still left with an unclear idea of whether it's even possible. There are some USB data analyzers, so if you could convert HDMI to USB, that's a possibility (I've put a rough sketch of that idea below). I'm just trying to figure out:

A) Whether what I'm trying to do is possible.

B) What the best way to do it is.
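
For context, the kind of thing I'm imagining is a cheap HDMI-to-USB capture card that enumerates as a regular UVC webcam, read with something like OpenCV. This is only a sketch of the idea, assuming such a device exists and shows up at index 0; I haven't tested it against any hardware:

```python
# Minimal sketch, assuming an HDMI-to-USB capture card that shows up as a
# normal UVC webcam. The device index, OpenCV, and the display loop are all
# assumptions on my part -- not tested against any specific hardware.
import cv2

cap = cv2.VideoCapture(0)              # open the capture device
if not cap.isOpened():
    raise RuntimeError("Could not open the capture device")

while True:
    ok, frame = cap.read()             # frame is a BGR numpy array
    if not ok:
        break
    # ... hand `frame` to the game AI here instead of displaying it ...
    cv2.imshow("HDMI feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```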

S. Buda
  • HDMI is a spec... In any case, you'd need to know _how_ to read the data. Latency will always be an issue unless your hardware's clock can keep up. You may need some electrical engineering to pull it off. Conversion will cause latency, and USB itself will as well. If you can write drivers, you may be able to get the output of one graphics card to be the input of another, but that's insane. Also, HDMI is proprietary. – Richard Barker May 10 '19 at 01:20
  • I'm not scared of getting my hands messy with drivers, but EE isn't my strongest skill set. Still, this project interests me, so I'm just starting to put out feelers for what I'm going to need. Thanks for the input! – S. Buda May 10 '19 at 03:42
  • Honestly, your best bet is to just screen-capture a computer. You could in theory read the electrical signals coming out of the HDMI cable, but you'd need to learn how to interpret them on another system, and it would have to be fast enough to keep up. – Richard Barker May 11 '19 at 00:39
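
Update: the screen-capture route suggested in the comments is probably the first thing I'll prototype. A rough sketch of what I have in mind, assuming the game can run on the capturing machine and using the mss library (my choice, not something suggested above, and untested):

```python
# Rough sketch of the screen-capture route, assuming the game runs on the
# same machine. The mss library and the monitor index are assumptions here.
import mss
import numpy as np

with mss.mss() as sct:
    monitor = sct.monitors[1]          # monitors[1] is the primary display
    while True:
        shot = sct.grab(monitor)       # raw BGRA pixel data
        frame = np.array(shot)         # numpy array of shape (height, width, 4)
        # ... feed `frame` to the game AI here ...
```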

0 Answers