VOGONS


First post, by chinny22

Rank l33t++

I can't believe I've never done a post on what is easily my most used PC in my retro fleet. Maybe it's because I've been slowly replacing parts over the years.

It started life with old hardware I had spare, mainly an Asus P5KPL-AM motherboard and a dual-core E5200.
The CPU was replaced with a Xeon X3320 pulled from a dead server at work. It has much the same specs as the E5200, so I doubt it was much of an upgrade, but Xeon on the POST screen looks cooler!

Then I started looking into SLI, more just for fun than actual performance gains, and learnt that cards with dual GPUs were a thing! I found someone on Gumtree selling a GeForce GTX 590; I thought it was just one, but he had a pair!
I was planning on buying a second later on, but here was an opportunity to buy two perfectly matched cards in one hit. That cost me £360 back in 2014.

It wasn't till 2018 that I actually got an SLI motherboard, when someone was clearing a bunch of NOS Asus P5N-D motherboards for £45.00.
As the old generic case would not be able to handle the SLI setup, I went on a spending spree and upgraded:

CPU heatsink: Intel OEM to a Noctua NH-U14S (I'm still impressed with their free support with backwards-compatible mounting kits)
RAM: 4GB of Crucial Ballistix Tracer 800MHz (with sexy red flashing LEDs)
Sound: SB X-Fi Titanium Fatal1ty Champion; I wanted the flagship X-Fi sound card for my "ultimate XP build"
HDD: a pair of 500GB spinning rust drives in a striped array
plus a Gotek and a DVD drive.
All put into a Corsair Obsidian 650D case.

Attachments: P5-N-Front.jpg, P5-N-Side1.jpg

My most demanding games aren't that demanding: GTA San Andreas and Farming Simulator 2013, which this system runs with ease, as well as the majority of my other games.
It's only really DOS and a few 9x titles where I need to use another PC.
I also find XP is old enough to network well with WFW yet new enough for Win11. The system is also fast enough that I'll typically use it to burn CDs, or to copy zipped programs across the network and then use the P3 to unzip.

Recently I upgraded to a pair of 1TB drives that I've put in a mirror. There's a slight performance hit, but I'm thinking it safeguards me somewhat with my games that require activation, e.g. FS2013.

Heat was always going to be an issue, and SpeedFan reports temps of around 85°C when playing games, which is why I've added a few extra fans just to help out.

Reply 1 of 17, by nd22

Rank Oldbie

Very nice build! What CPU are you using right now? You got a video card - actually two - more powerful than my newest card, a GTX 560 Ti!
I also got that Corsair RM750i - incredibly stable with new and old hardware alike - read: 5V-only Socket A boards!

Reply 2 of 17, by Joseph_Joestar

Rank l33t++

Regarding the heat issues, a single GTX 980 Ti would be cooler, quieter and far less power hungry, while delivering performance comparable to that SLI setup. They came down in price a fair bit, and modding the drivers to support one of those under WinXP is very easy.

And yeah, the X-Fi Titanium cards are great for WinXP gaming. Rare to see one complete with the front panel nowadays.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 3 of 17, by chinny22

Rank l33t++
nd22 wrote on 2025-07-04, 05:18:

Very nice build! What CPU are you using right now? You got a video card - actually two - more powerful than my newest card, a GTX 560 Ti!
I also got that Corsair RM750i - incredibly stable with new and old hardware alike - read: 5V-only Socket A boards!

That same Xeon X3320. I could upgrade cheaply now, but it has somewhat sentimental value, as the customer whose server I salvaged this from has long gone out of business.
Funny, as the HP DL120 G6 server it came from was a cheap, horrible server, but I admit it was suited to its task as a server for a POS system.

The PSU is nice, but I must admit I don't really use the software, as it's not supported in XP. This does dual boot into Win7, but I can't even remember the last time I booted into it.
Still, it takes up a USB header on the motherboard that would otherwise be empty 😀

Joseph_Joestar wrote on 2025-07-04, 05:45:

Regarding the heat issues, a single GTX 980 Ti would be cooler, quieter and far less power hungry, while delivering performance comparable to that SLI setup...
And yeah, the X-Fi Titanium cards are great for WinXP gaming. Rare to see one complete with the front panel nowadays.

I don't even need that much power. My Win7 box has a GTX 780, which plays my most demanding games and of course still has official WinXP support.
But that's boring; the only reason I have it in that build is that it has 2 CPUs.
Ideally this rig would be a dual-socket build as well, but even I can't justify the cost of a dual-socket SLI-capable board.

I do like the Creative front panels and have both the internal SB0250A and external SB0360 variants for the Audigy 2.

Reply 4 of 17, by H3nrik V!

Rank l33t
Joseph_Joestar wrote on 2025-07-04, 05:45:

Regarding the heat issues, a single GTX 980 Ti would be cooler, quieter and far less power hungry, while delivering performance comparable to that SLI setup. They came down in price a fair bit, and modding the drivers to support one of those under WinXP is very easy.

But it would be nowhere near as cool (well not literally, obviously) as dual 590s ... If I get it right, a 590 is already an SLI'ed card, so this is a quad SLI setup ... Sooooooo sexy!

If it's dual it's kind of cool ... 😎

--- GA586DX --- P2B-DS --- BP6 ---

Please use the "quote" option if asking questions to what I write - it will really up the chances of me noticing 😀

Reply 5 of 17, by H3nrik V!

Rank l33t

I assume you settled for 4 GiB since you only run 32-bit OSes on it?

Really nice build!

If it's dual it's kind of cool ... 😎

--- GA586DX --- P2B-DS --- BP6 ---

Please use the "quote" option if asking questions to what I write - it will really up the chances of me noticing 😀

Reply 6 of 17, by AlexZ

Rank Oldbie

You may want to take a look at Re: Any love for AM2?, where I'm testing AM2(+) CPUs from the same era. Archer57 posted results for his E8600.

The takeaway is that quad cores from that era are not that great for Windows Vista coverage, because the clocks are unnecessarily low and Vista-era (2007-2009) games get CPU bottlenecked. The Windows XP era (2002-2006) is covered comfortably. I have many more powerful GPUs to test but want to try better CPUs first. Your Xeon X3320 at 2.5GHz has a very similar clock to my 2.6GHz Phenom; they are from the same era and serve the same purpose ( Re: Any love for AM2? ).

It doesn't matter that you have SLI; games will be CPU bottlenecked. It would be great if you could run 3DMark06 at the same resolutions I did, and perhaps the Crysis DX9 benchmark at 1600x1200 (or 1920x1024, almost the same area) at maximum settings (I used everything maxed plus full-screen anti-aliasing at 4x).
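
As a quick sanity check on the "almost the same area" remark, here is a minimal Python sketch of the pixel arithmetic (nothing benchmark-specific, just the two resolutions mentioned above):

# Pixel-count comparison for the two benchmark resolutions.
res_a = (1600, 1200)
res_b = (1920, 1024)
area_a = res_a[0] * res_a[1]  # 1,920,000 pixels
area_b = res_b[0] * res_b[1]  # 1,966,080 pixels
print(f"{res_a}: {area_a:,} px")
print(f"{res_b}: {area_b:,} px")
print(f"difference: {abs(area_b - area_a) / area_a:.1%}")  # ~2.4%, near enough the same GPU load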

I have a GeForce GTX 580 and a GTX 770 (a proxy for the GTX 680, which I don't have), with a GTX 780 and GTX 980 coming later.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 260 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 7 of 17, by UWM8

Rank Newbie
Joseph_Joestar wrote on 2025-07-04, 05:45:

Regarding the heat issues, a single GTX 980 Ti would be cooler, quieter and far less power hungry, while delivering performance comparable to that SLI setup. They came down in price a fair bit, and modding the drivers to support one of those under WinXP is very easy.

And yeah, the X-Fi Titanium cards are great for WinXP gaming. Rare to see one complete with the front panel nowadays.

A GTX 980 Ti shows all its potential on a PCI Express 3.0 motherboard; LGA 1155 and Ivy Bridge would be the most suitable choice!
Technically speaking, a PCI Express 3.0 GPU on a PCI Express 2.0 motherboard should lose about 10% of its performance.
In addition to LGA 1155, LGA 2011 and some particular FM2+ motherboards would also be options for the ultimate Windows XP machine.
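
For context on that estimate, the theoretical x16 bandwidth of each PCIe generation can be worked out from the per-lane transfer rate and encoding overhead; a minimal Python sketch using the standard published rates (not measurements):

# Theoretical one-way bandwidth of an x16 slot per PCIe generation.
# Gen 1/2 use 8b/10b encoding (80% efficient); gen 3 uses 128b/130b.
generations = {
    "PCIe 1.x": (2.5, 8 / 10),    # (GT/s per lane, encoding efficiency)
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (8.0, 128 / 130),
}
lanes = 16
for name, (gtps, eff) in generations.items():
    gb_per_s = gtps * eff * lanes / 8  # divide by 8 to turn gigabits into gigabytes
    print(f"{name} x16: {gb_per_s:5.2f} GB/s per direction")
# PCIe 1.x x16: 4.00 GB/s, PCIe 2.0 x16: 8.00 GB/s, PCIe 3.0 x16: 15.75 GB/s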

UltraWide PC gamer running old and fine games at high refresh rate https://www.youtube.com/@UltrawideM8
My main PC:
AMD Ryzen 5 2600x
NVIDIA GTX 1660 Super
4x8GB DDR4 G.Skill Aegis 3200MHz (downclocked to 2800MHz)
ASUS TUF B450-PLUS GAMING

Reply 8 of 17, by agent_x007

Rank Oldbie
UWM8 wrote on 2025-07-07, 10:27:

A GTX 980 Ti shows all its potential on a PCI Express 3.0 motherboard; LGA 1155 and Ivy Bridge would be the most suitable choice!
Technically speaking, a PCI Express 3.0 GPU on a PCI Express 2.0 motherboard should lose about 10% of its performance.

To be fair, GPUs aren't meant to saturate PCIe bandwidth.
They only do so when they run out of VRAM (at which point even the latest 5.0 standard will struggle to keep up), and only if the storage they use on the other end (RAM, with the IMC/chipset in the middle) is able to keep up.

That's why there isn't any dramatic performance loss on an RTX 5090 in games, even with PCIe 3.0 x16. Everyone avoids transferring data over PCIe to prevent any slowdowns during gameplay.

In this case, a too-slow CPU/platform combination will be the biggest factor limiting GPU performance (mainly due to the software used).

Reply 9 of 17, by UWM8

Rank Newbie
agent_x007 wrote on 2025-07-07, 12:41:
UWM8 wrote on 2025-07-07, 10:27:

A GTX 980 Ti shows all its potential on a PCI Express 3.0 motherboard; LGA 1155 and Ivy Bridge would be the most suitable choice!
Technically speaking, a PCI Express 3.0 GPU on a PCI Express 2.0 motherboard should lose about 10% of its performance.

To be fair, GPUs aren't meant to saturate PCIe bandwidth.
They only do so when they run out of VRAM (at which point even the latest 5.0 standard will struggle to keep up), and only if the storage they use on the other end (RAM, with the IMC/chipset in the middle) is able to keep up.

That's why there isn't any dramatic performance loss on an RTX 5090 in games, even with PCIe 3.0 x16. Everyone avoids transferring data over PCIe to prevent any slowdowns during gameplay.

In this case, a too-slow CPU/platform combination will be the biggest factor limiting GPU performance (mainly due to the software used).

I beg your pardon, but I disagree with you: many times I've saturated PCIe bandwidth by requesting a high frame rate, or under not-so-heavy loads, without fully using VRAM;
try HWiNFO and watch the relevant indicator (PCIe Link Speed): it gets saturated more easily than you think, especially if you have a high refresh rate monitor.
When running out of VRAM, modern OSes share half of system RAM and add it to the VRAM; not the best option for performance, but the cheapest way to mitigate VRAM demand.
Surely we should also consider the CPU's IPC and core count, but obviously it depends on which software is running.

UltraWide PC gamer running old and fine games at high refresh rate https://www.youtube.com/@UltrawideM8
My main PC:
AMD Ryzen 5 2600x
NVIDIA GTX 1660 Super
4x8GB DDR4 G.Skill Aegis 3200MHz (downclocked to 2800MHz)
ASUS TUF B450-PLUS GAMING

Reply 10 of 17, by agent_x007

Rank Oldbie

And you have the right to not agree with me 😀

Some notes though:
1) Here are minimum framerate numbers for the RTX 5090 when running at multiple limited PCIe speeds (done by TPU): LINK
^I fail to imagine a GTX 590 being able to saturate PCIe 2.0 when a 5090 retains ~80% (avg.) of its performance when running on it.

2) Just because you see the PCIe link speed change in GPU-Z/HWiNFO, it doesn't mean the PCIe bus gets saturated. That indicator simply works in tandem with the switch from 2D clocks to 3D ones (in line with a "we go 3D, we go full PCIe speed" mentality).
It doesn't "go up" when the GPU reaches a certain GB/s threshold of PCIe bus bandwidth used (meaning: it doesn't show "1.1" while up to 4GB/s is used and switch to 2.0 mode when the transfer requirement goes above that).

3) Lastly, PCIe 2.0 has 8GB/s maximum bandwidth in each direction (again, in theory).
~8GB/s is pretty taxing on an LGA 775 platform. The Xeon X3320 has a maximum FSB bandwidth of ~21GB/s (at 1333 QPB [effective] FSB speed), while the memory subsystem in a DDR2-800 dual channel configuration manages ~12.8GB/s.
Those are pure paper numbers though; actual limits are a lot lower due to buffers, signalling inefficiencies, etc.
Regardless, allocating 8GB/s of bandwidth to the GPU bus, when that corresponds to almost 63% of the overall RAM bandwidth available for both CPU and GPU data, is... possible, but pretty hard, since it assumes the CPU can actually work with limited RAM bandwidth while the GPU needs that maximum.
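
A minimal Python sketch of the arithmetic behind point 3 (theoretical peak numbers only; the real-world limits are lower, as noted above):

# Peak-bandwidth arithmetic from point 3.
pcie2_x16 = 16 * 0.5                 # GB/s: 16 lanes x 500 MB/s per lane (PCIe 2.0)
ddr2_800_dual = 800e6 * 8 * 2 / 1e9  # GB/s: 800 MT/s x 8-byte bus x 2 channels
print(f"PCIe 2.0 x16: {pcie2_x16:.1f} GB/s")
print(f"DDR2-800 dual channel: {ddr2_800_dual:.1f} GB/s")
print(f"GPU bus vs RAM bandwidth: {pcie2_x16 / ddr2_800_dual:.1%}")  # 62.5%, the "almost 63%"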

Just wanted to point out a few things 😉

Reply 11 of 17, by AlexZ

Rank Oldbie

8GB/s? Where would you get that much data for the GPU from? RAM? During the 775 lifetime you had 4-8GB of RAM, and it would have taken ages to load from the SSDs/HDDs of the time, which happens at the start of a mission/level.

Do games actually allow you to select a detail level beyond the memory capacity of your GPU?

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 260 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 12 of 17, by agent_x007

Rank Oldbie

I'm not the one claiming:

Many times I've saturated PCIe bandwidth by requesting a high frame rate, or under not-so-heavy loads, without fully using VRAM;

😁

But I will address the other thing:

@AlexZ Since when does a GPU have time to wait a whole second (1000ms) for the data it requires?
On loading menus it's fine, but games prefer not to throw up loading screens all the time (players get angry about them) 😀
It's all about getting things delivered in ~30ms, or you can forget about even a "cinematic" 30FPS.
PCIe 2.0 can deliver ~240MB of data in those 30ms; if the next frame/scene requires more than 240MB of data swapped through VRAM, you are screwed (and get a stutter/lag spike as a reward).

The catch is: you will only see it when it happens ALL THE time (constant reads from and writes into VRAM from an external storage medium).
That's when you can talk about PCIe being saturated.
High-FPS situations rely even more on the VRAM inside the GPU and not PCIe (because the time to get data is even smaller: 8ms or less).
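
A minimal Python sketch of that per-frame transfer budget, assuming the theoretical 8GB/s PCIe 2.0 x16 peak from earlier:

# Data a PCIe 2.0 x16 link can move within one frame's time budget.
pcie2_mb_per_ms = 8.0  # 8 GB/s theoretical peak = 8 MB per millisecond
for frame_ms in (33.3, 16.7, 8.0):  # roughly 30, 60 and 120 FPS frame times
    budget = pcie2_mb_per_ms * frame_ms
    print(f"{frame_ms:>4} ms frame: ~{budget:.0f} MB transferable")
# ~266 MB at 30 FPS (the ~240MB figure above uses a flat 30 ms); only ~64 MB at 120 FPS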

Reply 13 of 17, by UWM8

Rank Newbie
agent_x007 wrote on 2025-07-08, 08:20:

And you have the right to not agree with me 😀

Some notes though:
1) Here are minimum framerate numbers for the RTX 5090 when running at multiple limited PCIe speeds (done by TPU): LINK
^I fail to imagine a GTX 590 being able to saturate PCIe 2.0 when a 5090 retains ~80% (avg.) of its performance when running on it.

2) Just because you see the PCIe link speed change in GPU-Z/HWiNFO, it doesn't mean the PCIe bus gets saturated. That indicator simply works in tandem with the switch from 2D clocks to 3D ones (in line with a "we go 3D, we go full PCIe speed" mentality).
It doesn't "go up" when the GPU reaches a certain GB/s threshold of PCIe bus bandwidth used (meaning: it doesn't show "1.1" while up to 4GB/s is used and switch to 2.0 mode when the transfer requirement goes above that).

3) Lastly, PCIe 2.0 has 8GB/s maximum bandwidth in each direction (again, in theory).
~8GB/s is pretty taxing on an LGA 775 platform. The Xeon X3320 has a maximum FSB bandwidth of ~21GB/s (at 1333 QPB [effective] FSB speed), while the memory subsystem in a DDR2-800 dual channel configuration manages ~12.8GB/s.
Those are pure paper numbers though; actual limits are a lot lower due to buffers, signalling inefficiencies, etc.
Regardless, allocating 8GB/s of bandwidth to the GPU bus, when that corresponds to almost 63% of the overall RAM bandwidth available for both CPU and GPU data, is... possible, but pretty hard, since it assumes the CPU can actually work with limited RAM bandwidth while the GPU needs that maximum.

Just wanted to point out a few things 😉

Thank you very much! That was delightfully enlightening!

UltraWide PC gamer running old and fine games at high refresh rate https://www.youtube.com/@UltrawideM8
My main PC:
AMD Ryzen 5 2600x
NVIDIA GTX 1660 Super
4x8GB DDR4 G.Skill Aegis 3200MHz (downclocked to 2800MHz)
ASUS TUF B450-PLUS GAMING

Reply 14 of 17, by chinny22

Rank l33t++
H3nrik V! wrote on 2025-07-04, 07:47:

I assume you settled for 4 GiB since you only run 32-bit OSes on it?

Really nice build!

You assume correctly; with a 32-bit OS being the main focus, there's no real benefit to more RAM.

H3nrik V! wrote on 2025-07-04, 07:45:
Joseph_Joestar wrote on 2025-07-04, 05:45:

Regarding the heat issues, a single GTX 980 Ti would be cooler, quieter and far less power hungry, while delivering performance comparable to that SLI setup. They came down in price a fair bit, and modding the drivers to support one of those under WinXP is very easy.

But it would be nowhere near as cool (well not literally, obviously) as dual 590s ... If I get it right, a 590 is already an SLI'ed card, so this is a quad SLI setup ... Sooooooo sexy!

Also correct; this has 4 GPUs, but WinXP Quad SLI support ended with the 7950 GX2 (I didn't know that when building the system). 2-way SLI works though.
The only reason I really installed Win7 was to see Quad SLI actually work!
The strange thing is it's totally sexy, yet every time I drop it into a conversation with an attractive girl, they never seem impressed. Strange!

AlexZ wrote on 2025-07-07, 09:53:

You may want to take a look at Re: Any love for AM2?, where I'm testing AM2(+) CPUs from the same era. Archer57 posted results for his E8600.

The takeaway is that quad cores from that era are not that great for Windows Vista coverage, because the clocks are unnecessarily low and Vista-era (2007-2009) games get CPU bottlenecked. The Windows XP era (2002-2006) is covered comfortably. I have many more powerful GPUs to test but want to try better CPUs first. Your Xeon X3320 at 2.5GHz has a very similar clock to my 2.6GHz Phenom; they are from the same era and serve the same purpose ( Re: Any love for AM2? ).

It doesn't matter that you have SLI; games will be CPU bottlenecked. It would be great if you could run 3DMark06 at the same resolutions I did, and perhaps the Crysis DX9 benchmark at 1600x1200 (or 1920x1024, almost the same area) at maximum settings (I used everything maxed plus full-screen anti-aliasing at 4x).

I fully admit this is by no means an optimised build and the CPU is definitely a major bottleneck, but then I also don't have any demanding games. This whole exercise was more about playing with the hardware itself than building a powerhouse gaming rig.
It just also happens to play my XP-era games fully maxed out.

That said, I'm starting to get that itch of wanting to build another system.
I actually own a pair of 7950 GX2s without a home. I'm thinking of using them in this PC so I can get Quad SLI working in XP, and possibly building another system around an X58 motherboard for the GTX 590s.

Reply 15 of 17, by AlexZ

Rank Oldbie
chinny22 wrote on 2025-07-10, 05:29:

That said, I'm starting to get that itch of wanting to build another system.
I actually own a pair of 7950 GX2s without a home. I'm thinking of using them in this PC so I can get Quad SLI working in XP, and possibly building another system around an X58 motherboard for the GTX 590s.

GTX 590s would be a great fit for X58 with a 3.3-3.4GHz Core i7.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 260 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 16 of 17, by H3nrik V!

Rank l33t
chinny22 wrote on 2025-07-10, 05:29:
H3nrik V! wrote on 2025-07-04, 07:45:
Joseph_Joestar wrote on 2025-07-04, 05:45:

Regarding the heat issues, a single GTX 980 Ti would be cooler, quieter and far less power hungry, while delivering performance comparable to that SLI setup. They came down in price a fair bit, and modding the drivers to support one of those under WinXP is very easy.

But it would be nowhere near as cool (well not literally, obviously) as dual 590s ... If I get it right, a 590 is already an SLI'ed card, so this is a quad SLI setup ... Sooooooo sexy!

Also correct; this has 4 GPUs, but WinXP Quad SLI support ended with the 7950 GX2 (I didn't know that when building the system). 2-way SLI works though.
The only reason I really installed Win7 was to see Quad SLI actually work!
The strange thing is it's totally sexy, yet every time I drop it into a conversation with an attractive girl, they never seem impressed. Strange!

Seriously? You must be barking up the wrong trees if they aren't impressed by that!

If it's dual it's kind of cool ... 😎

--- GA586DX --- P2B-DS --- BP6 ---

Please use the "quote" option if asking questions to what I write - it will really up the chances of me noticing 😀

Reply 17 of 17, by chrismeyer6

Rank l33t

If you are thinking of moving to X58, I'd highly recommend the Xeon X5690. I have two systems here with them and they are fantastic and cheap; I got two for 50 bucks last year. They OC to 4.0-4.1GHz on basically stock voltage. I built my kids a Bazzite-based gaming system recently with that CPU and it's still great.