VOGONS


First post, by porschemad911

User metadata
Rank Newbie

Just wanted to share my experience with running both AMD and Nvidia cards on my 32-bit Windows XP system. I know the conventional wisdom is that they do not play nicely together, and that if switching between them you should run DDU to totally wipe the other vendor's drivers from the system. I suspected this might be a myth, so I did some testing.

First, the motivation: my WinXP system is working very nicely with a Sapphire Radeon 7970. Well, working nicely except for the grass rendering bug in Star Wars KOTOR, one of my favourite games. I could not fix this using the gloverride patch with any of the driver versions supported by my 7970, so it looked like I needed to run an Nvidia GPU for that game. I didn't, however, want to use an Nvidia GPU for all my games due to the lack of GPU scaling support over DisplayPort.

My board (a Gigabyte GA-990FXA-UD7) has 2 x PCIE 2.0 x16 slots, and my 850W Seasonic PSU has 4 x 8-pin PCIE cables, so I grabbed an old GTX 670 and installed it in the second x16 slot. No issues with booting into Windows and installing the Nvidia drivers. No issues after rebooting either - the 7970 provided POST display output and desktop display output in WinXP, and ran games perfectly fine, just like before installing the Nvidia card and drivers. Device Manager showed both cards, the Nvidia and Catalyst control panels identified them correctly, and all seemed to be working well.

To switch between the GPUs, I found the following to work well:

  • Reboot into BIOS and toggle which PCIE slot to init display from
  • Shut down and switch the DisplayPort cable over to the desired card
  • Boot - POST, Windows display output and games all seem to use the desired card just fine

Primarily I'm running things on the 7970, but it is nice to be able to switch to the Nvidia GTX 670 GPU for a game that is problematic with Radeon drivers.
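
If anyone wants to double-check which adapter Windows actually ended up treating as the primary display after a swap, the rough sketch below is one way to do it. It assumes a Python interpreter is available on the XP box (not something I've covered above), and it only calls the Win32 EnumDisplayDevices API via ctypes:

# Quick check of which display adapter Windows treats as primary.
# Minimal sketch, assuming a Python interpreter on the XP machine;
# uses only the Win32 EnumDisplayDevices API via ctypes.
import ctypes
from ctypes import wintypes

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

def list_adapters():
    # Walk the adapter list and print which one Windows calls primary.
    i = 0
    while True:
        dev = DISPLAY_DEVICE()
        dev.cb = ctypes.sizeof(DISPLAY_DEVICE)
        if not ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
        print("%s  attached=%s  primary=%s" % (dev.DeviceString, attached, primary))
        i += 1

if __name__ == "__main__":
    list_adapters()

With the BIOS set to init display from a given slot, that card should be the one reporting primary=True; if the other card reports it instead, Windows has picked a different primary adapter than expected.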

Reply 1 of 4, by agent_x007

User metadata
Rank Oldbie

I don't think XP is smart enough for "use this as primary display" to work between different cards (NV + ATI)...
An alternative to the BIOS option is to disable one card in Device Manager.

Also, a DP switch (2 inputs, 1 output) should make things a bit easier.


Reply 2 of 4, by The Serpent Rider

User metadata
Rank l33t++

Windows XP is exactly that smart, and 3D applications will work with the primary display adapter.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 3 of 4, by porschemad911

User metadata
Rank Newbie

I was surprised to find it worked this smoothly, particularly because of the conventional wisdom to DDU everything when switching between Radeon / Geforce cards or risk abysmal performance, crashes, graphical glitches and so on.

Not too worried about the minor inconvenience of switching, as 90% of the time I will be running on the Radeon GPU and just using the Geforce card when necessary. I'm used to unplugging / re-plugging / swapping systems around anyway (modern gaming box / WinXP gaming box / SFF work machine). The only issue with disabling a card in Device Manager vs BIOS is that I wouldn't get a POST display unless that card is selected for display init in BIOS.

The GTX 670 works well, but now that I know the concept works, I'm going to swap in an EVGA GTX 780 Classified just for fun to get a bit of overclocking going. Not exactly mandatory for running Star Wars KOTOR at 1280 x 1024, but why not? See what Crysis can do while I'm at it ...

Reply 4 of 4, by porschemad911

User metadata
Rank Newbie

After living with this setup for a while, I have concluded that the most trouble-free way to use one GPU is to disable all related devices for the other GPU in Device Manager, in addition to selecting the desired primary display adapter in BIOS. This has really been working well for me!

I found that if I don't disable the other GPU in Device Manager, Windows sometimes gets confused about which card to render the desktop on. E.g. it would display the POST on the Geforce if I selected that in BIOS, but on entering Windows it would output the desktop to the Radeon card.

I also found that some games that have issues with Nvidia drivers (e.g. Mass Effect and its General Protection Faults) would still crash with the Radeon doing the Windows display output if the Nvidia GPU wasn't disabled in Device Manager. Disabling the Nvidia GPU prevents its driver from loading at all and resolves the crashes.
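
For what it's worth, the Device Manager step could probably also be scripted rather than clicked through each time. Below is a minimal sketch of that idea; it assumes Python plus Microsoft's devcon.exe command-line tool are available on the XP machine (devcon is not part of a stock install), and it matches devices by the standard PCI vendor IDs 10DE (Nvidia) and 1002 (ATI/AMD), which also catches related devices like the card's HDMI audio function:

# Rough sketch of toggling one vendor's GPU devices from the command line.
# Assumes Python and Microsoft's devcon.exe are both available on the XP box.
import subprocess
import sys

VENDOR_IDS = {"nvidia": "10DE", "ati": "1002"}  # standard PCI vendor IDs

def set_vendor_devices(vendor, action):
    # action is "disable" or "enable"; the wildcard pattern matches every
    # PCI device from that vendor, including the GPU's audio function.
    pattern = "PCI\\VEN_%s*" % VENDOR_IDS[vendor]
    subprocess.check_call(["devcon", action, pattern])

if __name__ == "__main__":
    # usage (hypothetical): python gpu_toggle.py nvidia disable
    vendor, action = sys.argv[1].lower(), sys.argv[2].lower()
    set_vendor_devices(vendor, action)

This would only replace the clicking; the BIOS display-init selection and cable swap still have to be done as before.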