
I need to select the monitor my game is displayed on from the command line on dual-monitor systems. If I use the Unity Screen Selector Dialog, it gives me the choice of which monitor to display the game on when starting, and that works fine. But when I load the game from the command line with `MyGame -adapter 1` or `MyGame -adapter 2`, it seems to ignore the argument and just loads the game on the same monitor every time.

Notes: I have a dual-monitor system, but only one video card (GeForce GT 740). I am using Unity 5.6.1f1 (64-bit) and Windows 10.

Please let me know what I am missing.

JITRock
  • The first thing to do is to make sure that Unity actually detects 2 monitors by running `Debug.Log(Display.displays.Length);` – Programmer Jun 28 '17 at 01:47
  • `Debug.Log(Display.displays.Length);` from within the game returns 1. But when I launch the game outside of Unity, the default game launcher gives me a choice of which monitor I want to run the game on. I can select either monitor from the game launcher, and the game will run on that monitor. I have created my own game launcher, and would like to use Unity's Standalone Player built-in command-line arguments when calling the game from my launcher. – JITRock Jun 28 '17 at 02:33
  • Are you building as UWP or Standalone? – Programmer Jun 28 '17 at 03:37
  • Standalone, and I am using WPF for the launcher (a rough sketch of the launch call is below). – JITRock Jun 29 '17 at 00:29
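
For reference, a minimal sketch of how a WPF launcher might start the standalone player with the `-adapter` argument; the executable path and adapter index are placeholders, not values from the question:

```csharp
using System.Diagnostics;

// Hypothetical launcher-side call: starts the built player and passes
// the monitor index via Unity's -adapter command-line argument.
public static class GameLauncher
{
    public static void LaunchOnAdapter(int adapterIndex)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = @"C:\Games\MyGame\MyGame.exe", // placeholder path to the build
            Arguments = "-adapter " + adapterIndex,   // e.g. 1 or 2
            UseShellExecute = false
        };
        Process.Start(startInfo);
    }
}
```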

1 Answer


This is a known bug and it does not seem to have been fixed yet. The `-adapter` argument does not work with any Direct3D version above 9.

Go to File --> Build Settings, select your PC Windows platform, then go to Other Settings.

From here, disable the Auto Graphics API for Windows checkbox.

You will be given the option to choose which Direct3D version to use. Make sure to remove all the other Direct3D versions and leave only Direct3D9 there. Stick with Direct3D9 until Unity fixes this bug.



If that does not work:

Go to File --> Build Settings, select your PC Windows platform, then go to Resolution and Presentation.

Disable "Default Is Full Screen" and "Display Resolution Dialog".



Again, if that fails, use the Display API to do it through code. You can find more information about that here. I also suggest that you file a bug report.
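
As a rough sketch of that Display-API route: the script below reads a custom, hypothetical `-monitor` argument passed by the launcher and renders the main camera on that display at startup. The argument name is an assumption; only `Display.displays`, `Display.Activate()`, and `Camera.targetDisplay` come from Unity's API. Note that `Display.displays.Length` is always 1 in the editor (as the first comment above shows), so this only has an effect in a built player.

```csharp
using System;
using UnityEngine;

// Attach to an object in the first scene. Reads a hypothetical "-monitor N"
// command-line argument (1-based) and renders the main camera on that display.
public class MonitorSelector : MonoBehaviour
{
    void Awake()
    {
        int requested = 1; // default: primary display

        // Look for "-monitor N" among the arguments the launcher passed in.
        string[] args = Environment.GetCommandLineArgs();
        for (int i = 0; i < args.Length - 1; i++)
        {
            int parsed;
            if (args[i] == "-monitor" && int.TryParse(args[i + 1], out parsed))
            {
                requested = parsed;
            }
        }

        int index = requested - 1; // Display.displays is 0-based

        // Extra displays are only reported in a standalone build, so guard the index.
        if (index > 0 && index < Display.displays.Length)
        {
            Display.displays[index].Activate();  // turn the secondary display on
            Camera.main.targetDisplay = index;   // render the main camera there
        }
    }
}
```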

Programmer
  • It looks like I will be using code. The other 2 options did not work. I will definitely file a bug report. Thank you very much! – JITRock Jun 29 '17 at 01:47
  • Interesting. Go ahead and file the bug report. Don't forget to [accept](https://meta.stackexchange.com/a/5235) the answer when you get one working. – Programmer Jun 29 '17 at 02:07
  • Also don't forget to launch it with the `-multidisplay` argument if you are going to run it from code. – Programmer Jun 29 '17 at 02:14
  • I had some time to think this through, and changing monitors at the scene level is not how I would like to do this. The fact that this is a known bug is very helpful info though. Please let me know if you know anything about how the default standalone player game launcher works. If I knew how it worked maybe I could just do the same thing. Thanks for all the help, and the extra knowledge! – JITRock Jun 30 '17 at 02:09
  • *"changing monitors at the scene level is not how I would like to do this"* It's just a work-around at this time. If you file for a bug there could be a fix soon. I do think you can program it and let people choose monitors when the game is loaded. I actually prefer this method since it gives you more control. – Programmer Jun 30 '17 at 03:51
  • The bug is filed. I like the idea of being able to select the monitor in game as well. I would like to implement both options. I will look into the in game monitor change more as well. Thanks. – JITRock Jun 30 '17 at 20:58