We have an application running on an EC2 Windows server that stops working as soon as the Remote Desktop session to the server is closed. The application OCRs the live screen, looking for specific windows, menus, and dialog boxes, but without an active session Windows skips rendering those elements to save system resources.
This application acts somewhat like Selenium: it takes control of the mouse and keyboard and launches different applications. When an application is launched, it scans the screen for specific text from menus and dialogs to determine whether certain windows are open.
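To make the failure mode concrete, here is a minimal sketch of the detection step described above. This is illustrative only (the names `MARKERS` and `window_is_open` are hypothetical, not from our code): after OCR produces the visible screen text, the automation checks for marker strings to decide whether a window is open.

```python
# Hypothetical sketch of the screen-text check; names are illustrative.
MARKERS = {
    "save_dialog": ["Save As", "File name:"],
    "print_dialog": ["Print", "Select Printer"],
}

def window_is_open(ocr_text: str, window: str) -> bool:
    """True if every marker string for `window` appears in the OCR'd screen text."""
    return all(marker in ocr_text for marker in MARKERS[window])

# Once the session disconnects, Windows stops rendering the UI, the OCR
# step returns little or no text, and checks like this start failing.
```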
When the Remote Desktop connection is disconnected, the server saves resources by skipping the UI rendering the application depends on, so it can no longer read the screen and stops working properly. Right now I just keep a Remote Desktop connection open 24/7 from my laptop, but the application breaks every time I close the laptop lid.
I also tried logging into an alternate Windows Server and, from that server, opening a Remote Desktop session into the problem server. I leave that inner session open, but when I disconnect from the alternate server, the UI rendering issue somehow cascades through the alternate server into the problem server.
I'm wondering if there's any way to force Windows Server to keep doing all the UI rendering as if a user were still connected, even when there are no active sessions.
Or is there another clever workaround?