
I am using Unity3D to render a scene, and my machine has multiple GPUs. How can I select which GPU is used for rendering? For example, I want to use the second GPU. Is there any way to specify a GPU?

airesearch
  • You want to display different stuff on different monitors? – Programmer Jul 07 '16 at 22:34
  • Actually, I have a fast and a slow GPU. I want to select the fast GPU for rendering. – airesearch Jul 07 '16 at 22:38
  • Whilst I don't know the answer to this - I am curious as to why you'd have two of different speeds? – Muhwu Jul 08 '16 at 00:37
  • @Muhwu Some computers come with two GPUs to preserve battery life. The low-power GPU is used for just browsing websites, typing documents and playing videos. The other one is automatically enabled by the OS when using CPU/GPU-intensive apps such as Maya and Photoshop, or when playing games. Probably that, or he added another, more powerful graphics card to his computer. – Programmer Jul 08 '16 at 00:57
  • Do you want to do this for the Editor or game build? – Programmer Jul 08 '16 at 01:31

6 Answers


There is a Unity command-line option, -gpu #, which lets you select which GPU Unity apps run with:

UnityApp.exe -gpu 1

editor.exe -gpu 0

I haven't seen this command-line argument documented, but it lets me test on both my integrated and dedicated GPUs.
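
To verify which adapter was actually picked, you can log it from a script. This is a minimal sketch using Unity's SystemInfo API; the class name GpuReporter is just illustrative:

using UnityEngine;

// Attach to any GameObject; logs which GPU the rendering backend
// actually initialized with at startup.
public class GpuReporter : MonoBehaviour
{
    void Start()
    {
        Debug.Log("GPU in use: " + SystemInfo.graphicsDeviceName
                  + " (" + SystemInfo.graphicsDeviceType + ")");
    }
}

Checking Editor.log (or the player log) for the device name works as well.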

TWhittaker

You can change the launch settings of a program, such as compatibility mode or graphics performance. Every graphics card has a manager installed on the computer, and you can change its settings or add your application to it. Here is a link to a guide for NVIDIA:

Link to guide

Try to apply this guide to Unity3D.

Ramazan Kürkan
  • Is there any way to do that via the command line, so we can call it from our code? (A launcher sketch follows these comments.) – airesearch Jul 08 '16 at 16:53
  • I doubt that it's available; it's not a standard. As far as I know there is some CLI control under Linux, but under Windows I know of nothing. However, my knowledge is a little outdated. – Ramazan Kürkan Jul 08 '16 at 20:50
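
If you do want to drive this from code, one option is to start the built player yourself and pass the undocumented -gpu argument from the answer above. A minimal sketch, assuming a Windows build whose path here is purely illustrative:

using System.Diagnostics;

static class GpuLauncher
{
    static void Main()
    {
        // Start the player on the second adapter (-gpu 1). The path is a
        // placeholder; point it at your own build. The flag is undocumented.
        Process.Start(new ProcessStartInfo
        {
            FileName = @"C:\Builds\MyGame.exe",
            Arguments = "-gpu 1"
        });
    }
}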

I had almost the same situation: an onboard GPU0 and an external GPU1, and Unity was using the onboard GPU0. Now it is using GPU1. In Windows 10, set GPU1 as the 'Main Display' in your display settings. The NVIDIA Control Panel option to run a specific program (in my case Unity) with GPU1 did not work while my 'Main Display' was set to GPU0. So: connect a monitor to GPU1 and make it your 'Main Display', and connect your second monitor to GPU0 and use that for things other than Unity. Which, in a Microsoft way, makes perfect sense...

Hope it helps, M8o

Meto

I had the same problem, running a GeForce GTX 960M. In the NVIDIA Control Panel, under Manage 3D settings, I set Unity to run only on the GPU, but this still did not fix the issue: Unity's Editor.log showed that it still did not pick up the GPU. I tried to specify it manually at the command prompt and it still did not work (see https://docs.unity3d.com/Manual/GPUProgressiveLightmapper.html). Finally I updated my driver (there was a new version out) with a clean install. Afterwards I loaded Unity, it detected the GPU, and I could use the Progressive GPU lightmapper for baking.

Johnny M

I solved it by installing a program from the AMD website that manages drivers and performance. In my case I have an AMD Radeon HD 8670M, which is fairly old, so I guess the usual "Update driver" button in Device Manager wasn't enough. After that, Unity, and also SketchUp, started recognizing the dedicated GPU and using it instead of the integrated Intel HD.

laf3rs

The documented option is '-force-device-index' when starting the Unity Player (when using DirectX). See https://docs.unity3d.com/Manual/PlayerCommandLineArguments.html
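
For example, assuming a Windows player build named MyGame.exe (the name is illustrative), this starts it on the second graphics adapter:

MyGame.exe -force-device-index 1

Per the manual, this flag makes the player use a particular GPU by index on platforms that support multiple devices; it is the documented counterpart of the -gpu flag mentioned above.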

Ruud van Gaal