
I am using Delphi XE. My laptop has two graphics cards (an integrated Intel one and a discrete NVIDIA one).

I want my program to use the stronger graphics card, which in my case is the NVIDIA.

Some other programs get this automatically, as in the screenshot below: Program Settings

It looks like the graphics card driver (or Windows) decides that a given game should use the better graphics card.

So how can I control which graphics card is used, programmatically?

Someone
  • "strongest" can be a vague measurement. Stronger in which sense? Faster GPU? Also, according to your screenshot, it automatically used some global setting which was apparently pre-selected elsewhere. – Jerry Dodge Dec 31 '15 at 15:05
  • Anything other than the integrated graphics card // sorry for my weak English – Someone Dec 31 '15 at 16:31
  • Your English is clear enough. So in that case, you want to detect whether the GPU is on-board or not? – Jerry Dodge Dec 31 '15 at 16:35
  • I want to run the program on the best graphics card available without being specific (if it's there then use it, else run normally). I'm not looking for someone to do it for me; I just have absolutely no idea where to start. – Someone Dec 31 '15 at 17:20
  • This is not a programming question. Your issues relate to your computer and how you use it. The user of your program can choose how to use their machine. – David Heffernan Dec 31 '15 at 17:20
  • It's true that the user can change that, but that's up to the Nvidia/AMD panel if the user doesn't interfere. Some programs are chosen automatically at first, not by users, and I want mine to do that too. I've played a lot of video games, enough to notice this. – Someone Dec 31 '15 at 17:41
  • Here is another screenshot showing how the Nvidia panel decided that SIDEBAR.EXE doesn't need to run using Nvidia; as you can see, it says "AUTO": http://s19.postimg.org/4d2m1i5ir/2015_12_31_204459.png and this is how it should look if the user has chosen his card: http://s19.postimg.org/vq7sw9c37/image.png – Someone Dec 31 '15 at 17:53
  • SysInternals Process Monitor - > change settings in NVidia Panel -> search in log where they are stored -> hijack them -> be found and blacklisted – Arioch 'The Dec 31 '15 at 20:29
  • About all you can really do is enumerate the available monitors, use some lower-level system information to figure out which graphics card is used for which monitor, then decide which one you want and display your UI(s) only on that monitor (a sketch of this enumeration appears after these comments). Most higher-end games use OpenGL or DirectX for their UIs, which provide much more control than GDI/Win32 offers. – Remy Lebeau Dec 31 '15 at 21:17
  • @RemyLebeau you're forgetting Optimus-style laptop systems where the same monitor can be fed by both the integrated graphics and a discrete GPU. The discrete one can simply be powered off if driver software like the NVidia panel sees no reason to turn it on. – Arioch 'The Dec 31 '15 at 22:00
  • I have never heard of a single monitor being fed by multiple graphics adapters. – Remy Lebeau Dec 31 '15 at 22:20
  • Many modern laptops have two graphics adapters, Intel and AMD/Nvidia; there are options in the Nvidia control panel to choose from, as you can see when right-clicking on a shortcut: http://s19.postimg.org/aeiohjlo3/rrrr.png – Someone Dec 31 '15 at 22:38
  • I think the asker is talking about switchable/Optimus graphics technology, found primarily in laptops. Basically the laptop runs everything on the integrated graphics and then switches to the dedicated GPU for gaming, video rendering etc. to save battery and keep heat down. I suspect the asker wants to force the dedicated GPU and not use the integrated one. – Craig Dec 31 '15 at 23:32
  • The screenshot in the question shows how to do it – David Heffernan Jan 01 '16 at 09:28
  • @DavidHeffernan you must be so special. – Someone Jan 01 '16 at 11:24
  • There's no need for that. I can certainly help you ask a better question. – David Heffernan Jan 01 '16 at 11:34
  • @DavidHeffernan I think he wants to do this programmatically. I don't know what project he is working on, but I think he is doing graphics processing of some sort and the integrated graphics card is being used rather than the dedicated one. Really, clearer information is needed here; bad manners is certainly not going to help his cause either. – Craig Jan 01 '16 at 13:00
  • @Craig I am running a game, and the original game already uses Nvidia by default, totally automatically, buuuut when I runPE the game it runs on the integrated graphics. @Craig if you visit my profile you might notice what DavidHeffernan has been doing!! – Someone Jan 01 '16 at 14:13
  • Your question needs to be improved. If you let us we can help you improve it. Perhaps ask it at the correct site. Then you'll get help. – David Heffernan Jan 01 '16 at 14:50
  • It's the correct site and section to ask, since I'm looking to do it programmatically. Thanks, but I'll just tell the user to do it manually until I figure out the answer. Such RAM, much love. – Someone Jan 01 '16 at 15:07
  • I guess you don't want to fix the question then. I don't know why you reject help. Instead of moaning about down votes you should ask yourself what's wrong with your posts. And try to do better. If you ask good questions you'll get good answers. – David Heffernan Jan 01 '16 at 15:32
  • So if this really is a programming question, you need to make it clear what you are asking about. Do you want something that works for this specific graphics setup, or for all possible cards? Lots more information is found in your comments than is present in the question. Questions should stand alone. There's lots that could be done to improve this one, and then you'd get help. It's almost as if your proud stubbornness is stopping you from asking a question that would yield the help you need. – David Heffernan Jan 01 '16 at 15:36
  • On some hybrid Intel/NVidia systems you can force the use of the NVidia GPU: in your application you need to export a constant `NvOptimusEnablement` with a value of 1 (a sketch follows below). If you don't know how to export a constant value from an executable, please ask and I can help. – Jasper Schellingerhout Jan 05 '16 at 21:24
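
Following up on Jasper Schellingerhout's comment above, here is a minimal sketch of what that export could look like in a Delphi project (.dpr) file. `NvOptimusEnablement` is documented by NVIDIA in its "Optimus Rendering Policies" guide; `AmdPowerXpressRequestHighPerformance` is AMD's counterpart for its switchable-graphics driver and is included here on the assumption that you want to cover both vendors. The project name is a placeholder; the relevant lines go in your own .dpr:

```delphi
program GpuHintDemo; // placeholder project name; use your own .dpr

uses
  Windows, Forms;

{$R *.res}

// Exported variables that hybrid-graphics drivers look for in the EXE.
// NvOptimusEnablement = 1 asks the NVIDIA Optimus driver to run this
// process on the discrete GPU; AmdPowerXpressRequestHighPerformance = 1
// is AMD's equivalent hint. Nothing in the program reads these values;
// the driver inspects the executable's export table at process start.
var
  NvOptimusEnablement: DWORD = 1;
  AmdPowerXpressRequestHighPerformance: DWORD = 1;

exports
  NvOptimusEnablement,
  AmdPowerXpressRequestHighPerformance;

begin
  Application.Initialize;
  // Create your forms here as usual, then:
  Application.Run;
end.
```

Note that NVIDIA's documentation says the symbol must be exported from the executable itself, not from a DLL it loads, and an explicit per-program setting in the NVIDIA control panel (as in the screenshots above) still takes precedence over this hint.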

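For the enumeration route Remy Lebeau describes, a console sketch using the Win32 `EnumDisplayDevices` call might look like the following (the program name is illustrative). Keep Arioch's caveat in mind: on Optimus laptops both GPUs can feed the same panel, and the discrete GPU may not appear as a desktop-attached device while powered down, so enumeration alone does not tell you which GPU will render your window.

```delphi
program ListDisplayAdapters; // illustrative name

{$APPTYPE CONSOLE}

uses
  Windows;

var
  dd: TDisplayDevice;
  DevNum: DWORD;
  Name, Desc: string;
begin
  DevNum := 0;
  FillChar(dd, SizeOf(dd), 0);
  dd.cb := SizeOf(dd); // the API requires cb to be set before each call
  // Walk the display devices known to this session. DeviceString holds the
  // human-readable adapter name, e.g. "Intel(R) HD Graphics" or a GeForce.
  while EnumDisplayDevices(nil, DevNum, dd, 0) do
  begin
    Name := dd.DeviceName;   // zero-based char arrays convert to string
    Desc := dd.DeviceString;
    if (dd.StateFlags and DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) <> 0 then
      Desc := Desc + ' (attached to desktop)';
    Writeln(Name, ' -> ', Desc);
    Inc(DevNum);
    dd.cb := SizeOf(dd);
  end;
end.
```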
0 Answers