
I made a simple OSG off-screen renderer that renders without popping up a window:

osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
traits->x = 0;
traits->y = 0;
traits->width = screenWidth;
traits->height = screenHeight;
if (offScreen) {
    // Render into a pbuffer instead of an on-screen window.
    traits->windowDecoration = false;
    traits->doubleBuffer = true;
    traits->pbuffer = true;
} else {
    traits->windowDecoration = true;
    traits->doubleBuffer = true;
    traits->pbuffer = false;
}
traits->sharedContext = 0;
// Read the DISPLAY environment variable into the traits first,
// then report which display will actually be used.
traits->readDISPLAY();
std::cout << "DisplayName : " << traits->displayName() << std::endl;

osg::GraphicsContext* _gc = osg::GraphicsContext::createGraphicsContext(traits.get());

if (!_gc) {
    osg::notify(osg::NOTICE) << "Failed to create pbuffer, falling back to normal graphics window." << std::endl;
    traits->pbuffer = false;
    _gc = osg::GraphicsContext::createGraphicsContext(traits.get());
}
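
The context then gets attached to the viewer's camera roughly like this (a sketch; scene, screenWidth and screenHeight come from the surrounding code):

osgViewer::Viewer viewer;                 // needs <osgViewer/Viewer>
viewer.setSceneData(scene.get());         // `scene` is the model to render
osg::Camera* camera = viewer.getCamera();
camera->setGraphicsContext(_gc);
camera->setViewport(new osg::Viewport(0, 0, screenWidth, screenHeight));
viewer.realize();
viewer.frame();                           // renders one frame off screen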

However, if I ssh to the server and run the application, it actually uses the client GPU rather than the server GPU. There are four GeForce GPUs on the server. I tried changing DISPLAY to hostname:0.0, but it did not work.
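
(For completeness: the target display can also be set directly on the traits; readDISPLAY() just fills in the same fields from the environment. A sketch, with hostname as a placeholder:)

// Select the X display explicitly instead of via DISPLAY.
traits->hostName = "hostname"; // the X server's host
traits->displayNum = 0;        // the ":0" part
traits->screenNum = 0;         // the ".0" part
// Equivalent shorthand:
// traits->setScreenIdentifier("hostname:0.0");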

What should I do to make the application use the server's GPUs rather than the client's GPU on Linux?

VforVitamin

2 Answers


First, a little bit of nomenclature: in X11, the system to which the display is connected is the server, so you have your terminology reversed. To make use of the GPUs on the remote system for OpenGL rendering, the currently existing Linux driver model requires an X11 server to be running (this is about to change with Wayland, but there's still a lot of work to be done before it can be used). Essentially, the driver is loaded into the X server, which is why you need one.

Of course an X server cannot be accessed by just any user; an XAuthority token is required (see the xauth manpage). Also, if no monitors are connected, you may have to do extra configuration to convince the GPU's driver not to refuse to start. You probably also want to disable the use of input devices.

Then, with an X server running and the user that will run the OSG program holding an XAuthority token, you can run the OSG program. Yes, it is tedious, but at the moment we're stuck with that.
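
Once such an X server is up, the program only has to target its display before the graphics context is created; a minimal sketch (assuming the server-side X server runs as display :0):

#include <cstdlib>

// Point the process at the headless X server before OSG reads
// DISPLAY. ":0" assumes that X server runs as display 0.
setenv("DISPLAY", ":0", 1 /* overwrite */);
traits->readDISPLAY(); // now resolves to the server's display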

datenwolf

I've done some searching, and for those who wind up at this question, I'll summarize what I found; I'll also update this answer with specific commands that enable server-side off-screen rendering. And yes, it is definitely possible.

  1. Use VirtualGL to route all OpenGL commands to the server.

    VirtualGL is an X11-specific toolkit that intercepts OpenGL commands and executes them on the server-side GPU (applications are typically launched through its vglrun wrapper). However, this can change server-side OpenGL behavior, so I would not recommend it if other users are running OpenGL at the same time.

  2. Off-screen rendering using the Mesa graphics library.

    Mesa is an open-source implementation of the OpenGL specification - a system for rendering interactive 3D graphics. A variety of device drivers allow Mesa to be used in many different environments, ranging from software emulation to complete hardware acceleration on modern GPUs.

    Mesa lets the user create a graphics context that resides in server-side memory, which allows off-screen rendering. link. A minimal sketch follows below.
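
For illustration, a minimal sketch of Mesa's off-screen path using its OSMesa API (this assumes Mesa was built with OSMesa support; link with -lOSMesa):

#include <GL/osmesa.h>
#include <GL/gl.h>
#include <cstdio>
#include <vector>

int main()
{
    const int width = 640, height = 480;
    std::vector<unsigned char> buffer(width * height * 4); // RGBA pixels

    // Create an off-screen context; no X server needed.
    OSMesaContext ctx = OSMesaCreateContext(OSMESA_RGBA, NULL);
    if (!ctx) {
        std::fprintf(stderr, "OSMesaCreateContext failed\n");
        return 1;
    }

    // Bind the context to our client-allocated buffer.
    if (!OSMesaMakeCurrent(ctx, buffer.data(), GL_UNSIGNED_BYTE, width, height)) {
        std::fprintf(stderr, "OSMesaMakeCurrent failed\n");
        return 1;
    }

    // Ordinary OpenGL calls now render into `buffer`.
    glClearColor(0.2f, 0.4f, 0.6f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glFinish();

    // `buffer` now holds the rendered RGBA image; write it out, etc.
    OSMesaDestroyContext(ctx);
    return 0;
}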

VforVitamin