
The VGA port on the integrated graphics card broke off the motherboard. I bought a replacement VGA card and added it to the system, hoping the server would just use the new card for its output, but that does not seem to be the case. How can I get the system to use the new VGA card when I cannot see anything on screen?

The new graphics card is an EVGA GeForce 210. The server is an older version of this one.

  • Usually this is done from the BIOS, when it's not automatic. What server is it? – Federico Galli Jun 13 '17 at 09:28
  • Do you have both CPU sockets populated? If not, some of the PCIe slots won't work, as PCIe slots are dedicated to a CPU socket. Consult the mainboard manual to find which slots to use. You could also try to connect to IPMI in a web browser and start the remote console to access the BIOS. – Thomas Jun 13 '17 at 10:17

1 Answer

I realized that there is a jumper on the motherboard I can change to disable the integrated graphics card. I just had to find the PDF with the motherboard layout to see which jumper it is. After disabling the integrated graphics card, the new graphics card started outputting a signal.

There is a similar setting in the BIOS as well, but of course I could not get to it without any display output. That setting was set to prefer the integrated card, which was the problem.
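
If the machine is still reachable over SSH (or through the IPMI remote console mentioned in the comments), you can confirm which display adapters the kernel actually sees before and after moving the jumper; a minimal check, assuming a Linux install with pciutils available:

    # List every display controller the kernel has enumerated; with the jumper
    # set correctly, the EVGA GeForce 210 (an NVIDIA GT218 chip) should show up.
    lspci | grep -i -E 'vga|display|3d'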
