VOGONS


Reply 20 of 38, by Archer57

Rank: Member

Yeah, a GTS250 would definitely be a safer and smarter pick. I myself gave this exact advice in another thread.

The reason I got these 8800GT cards is that they were incredibly cheap and are still plentiful. Given that, I am fine with them dying, even though I will try to avoid it. I even got a spare 😀 All the cards I've chosen also have decent coolers, no single-slot nonsense, so if they do die I can always repurpose the coolers.

That's also the reason for the GT rather than the GTS or GTX: G80-based cards are much more expensive and I did not want to deal with that, given they are all defective...

Funnily enough, 1GB versions of the 9800GTX+ or GTS250 were at least twice as expensive, and I wanted 1GB since I am messing with SLI and higher resolutions.

Reply 21 of 38, by AlexZ

Rank: Oldbie

The primary benchmark for s939 should be 3DMark 2005. That may give different results than 2003, as 2005 is more realistic with CPU load.

Buying multiple cheap 8800 GTs was a good choice.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 275 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 22 of 38, by Archer57

Rank: Member
AlexZ wrote on 2025-08-16, 08:22:

The primary benchmark for s939 should be 3DMark 2005. That may give different results than 2003, as 2005 is more realistic with CPU load.

Buying multiple cheap 8800 GTs was a good choice.

I'll also run Doom 3, Crysis, F.E.A.R. and Far Cry 2. These are the games I was able to find from roughly the appropriate time period which have good built-in benchmarks.

So far, this SLI setup is impressive. Being able to play Crysis on the high preset at 1920x1080 on a 2005 platform with 2007 GPUs is something I did not expect. Benchmark results are not all that great, but the actual game runs between 30 and 60 fps with no significant stutter or anything.

I've also now got this toy to play around with:

The attachment 20250816_192124_D.jpg is no longer available

From the scrap pile/untested as well, so I will have to pull it apart first, then see if it works. But if it does, it should be fun...

Reply 23 of 38, by AlexZ

Rank: Oldbie

On my Athlon 64 3400+ with GeForce GTX 275 I get 25 fps in the Crysis benchmark at 1600x1200 with the high preset and 4x anti-aliasing. In game, fps is in the 25-40 range with 30 being most common, so not really enjoyable. It is one of the few games that can utilize the GPU despite being CPU bottlenecked. In other games I see 20-25% GPU utilization; in Crysis, 80-100%.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 275 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 24 of 38, by Archer57

Rank: Member

So, I am now pretty much convinced I do not need SLI for the permanent build. Playing around with it is fun, but...

The attachment crysis.PNG is no longer available

Even at 1920x1080 the gains are minimal, at the cost of worse minimums. It may be different in the actual game, especially in later parts, but this game is pretty much a best-case scenario for SLI, so...

I am also probably going to go with PCIe and use a single 8800GT. It works noticeably better than the HD3870 and I prefer nvidia anyway.

Now, before assembling everything, I am going to run more benchmarks and fool around with that 3870x2, if it works. The card looks quite cooked and I, again, had to remove a couple of spoons' worth of horrible thermal compound (why do people do this?). I hope it is functional enough to at least run some benchmarks...

Reply 25 of 38, by Living

Rank: Member

I have several 939 CPUs but no motherboard other than a crappy ATI-chipset MSI (awful SATA and USB speed). All the nvidia ones I've come across ended up dead.

I do recommend the Opteron Toledo. Back in Q1 2007 I ran a 165 @ 2.7GHz with a Zalman CNPS7000C-Cu in a DFI Infinity NF4.

DSC02097.jpg

Reply 26 of 38, by AlexZ

Rank: Oldbie

So the SLI adventure didn't last long. There is too much overhead with SLI, it seems, and the benefit only becomes noticeable at really low fps, so it's basically useless. It only worked well in 3DMark 2003. What a commercial scam.

I stopped paying attention to min values while doing AM2 testing as I found them too random.

The Athlon 64 X2 4800+ helps in Crysis, but not as much as I had hoped. It may be worth testing with a GTS 250, GTX 560 or GTX 750. All of those are super cheap.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 275 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 27 of 38, by Archer57

Rank: Member
AlexZ wrote on 2025-08-16, 14:13:

So the SLI adventure didn't last long. There is too much overhead with SLI, it seems, and the benefit only becomes noticeable at really low fps, so it's basically useless. It only worked well in 3DMark 2003. What a commercial scam.

I stopped paying attention to min values while doing AM2 testing as I found them too random.

The Athlon 64 X2 4800+ helps in Crysis, but not as much as I had hoped. It may be worth testing with a GTS 250, GTX 560 or GTX 750. All of those are super cheap.

To be honest I expected this and never planned to use it practically. It is a fun thing to play around with though; perhaps I'll do that with something like LGA775 later.

I used SLI with a pair of GTX660s back when they were new, on LGA1155 with an i7-3770K, which was pretty close to its demise. That config actually lasted for a few years, which made me very familiar with the downsides of SLI, like increased frametimes. That's why I was kind of surprised by how well Crysis worked.

I actually got slightly better results in F.E.A.R.:

The attachment fear.PNG is no longer available

And much better results in some synthetics like Unigine Sanctuary:

The attachment sanctuary.PNG is no longer available

But I'd still say Crysis is playable and enjoyable on this system, one card or two. FPS is not as high as I'd like, but it is fairly consistent, and I suspect playing with the settings a bit could easily result in a very good experience. The dual core probably helps, because I've observed very high total CPU load, around 80-90%.

I am also not sure how this chipset works internally. Sure, it provides x16+x16 PCIe 1.0 slots, but how fast are the links between the bridges themselves and the CPU... something newer, even with x8+x8 2.0 (which gives each slot roughly the same ~4 GB/s per direction, since a 2.0 lane is twice as fast as a 1.0 lane), could work better.

Then there is the issue with sound... I want that Creative sound card, and SLI makes that impossible...


Reply 28 of 38, by Archer57

Rank: Member
Living wrote on 2025-08-16, 12:59:

I have several 939 CPUs but no motherboard other than a crappy ATI-chipset MSI (awful SATA and USB speed). All the nvidia ones I've come across ended up dead.

I do recommend the Opteron Toledo. Back in Q1 2007 I ran a 165 @ 2.7GHz with a Zalman CNPS7000C-Cu in a DFI Infinity NF4.

Interesting. I know nvidia chipsets are affected by bumpgate, but so far I've had very good luck with them, mostly on AM2(+).

Also, given my experience so far with S939... there may be no perfect boards. Maybe what you are using is not so bad; USB/SATA issues can be bypassed with separate controllers, especially with PCIe available.

Reply 29 of 38, by Living

Rank: Member
Archer57 wrote on 2025-08-16, 15:06:
Living wrote on 2025-08-16, 12:59:

I have several 939 CPUs but no motherboard other than a crappy ATI-chipset MSI (awful SATA and USB speed). All the nvidia ones I've come across ended up dead.

I do recommend the Opteron Toledo. Back in Q1 2007 I ran a 165 @ 2.7GHz with a Zalman CNPS7000C-Cu in a DFI Infinity NF4.

Interesting. I know nvidia chipsets are affected by bumpgate, but so far I've had very good luck with them, mostly on AM2(+).

Also, given my experience so far with S939... there may be no perfect boards. Maybe what you are using is not so bad; USB/SATA issues can be bypassed with separate controllers, especially with PCIe available.

AM2 is a completely different story; there are many boards I bought 17 years ago that are still in service in custom PCs for clients, mostly nForce 6150 and 7025.

939 suffered from bumpgate in full force, and many people jumped from 478, Socket A and 754 directly to AM2, AM3 / 775 due to the cost and longevity of those platforms (they were useful until at least 2009, since there was no real need for dual cores and 64 bits at the time). Plus, 939 was only on the market for barely two years before it was replaced by AM2. That's why you don't see many boards.

I tend to categorize 939 along with 423, AM1, FM1 and all the tick-tock bullshit from Intel. You know they are going to be expensive once they are discontinued.


Reply 30 of 38, by Archer57

Rank: Member
Living wrote on 2025-08-16, 15:30:

AM2 is a completely different story; there are many boards I bought 17 years ago that are still in service in custom PCs for clients.

939 suffered from bumpgate in full force, and many people jumped from 478, Socket A and 754 directly to AM2, AM3 / 775 due to the cost and longevity of those platforms (they were useful until at least 2009, since there was no real need for dual cores at the time). That's why you don't see many boards.

939 was weird. I remember considering an upgrade from S462 back then. Early single-core CPUs made very little sense; they were not all that much faster than an AthlonXP, and all the new stuff they offered, like x64, was still completely useless. Then the X2s came out and they were ridiculously expensive. I wanted one, but $1000 was definitely too much. Then AM2 happened with cheap dual cores... so yeah, I went from S462 to AM2 directly.

I was under the impression that early AM2 stuff was just as affected by bumpgate as late S939; it was not fixed for a few years... perhaps there are other factors at play here, like cooling. The board I have absolutely roasts the chipset unless measures are taken to cool it with a fan.

Living wrote on 2025-08-16, 15:30:

I tend to categorize 939 along with 423, AM1, FM1 and all the tick-tock bullshit from Intel. You know they are going to be expensive once they are discontinued.

I agree. That's also why I want it 😀

The boards are not all that rare though, at least where I live. Higher-end dual-core CPUs are more of an issue. Nobody sells Opterons out here, while Athlons are rare and ridiculously expensive. That's why one good deal on a couple of CPUs (+ motherboard as a bonus) started this adventure...

Also, yeah, this system is going to be a bumpgate special 😀 Chipset, GPU... well, if it dies I still have a viable (and likely very reliable) alternative with the AGP/HD3850.

Reply 31 of 38, by Living

Rank: Member
Archer57 wrote on 2025-08-16, 15:41:
Living wrote on 2025-08-16, 15:30:

AM2 is a completely different story; there are many boards I bought 17 years ago that are still in service in custom PCs for clients.

939 suffered from bumpgate in full force, and many people jumped from 478, Socket A and 754 directly to AM2, AM3 / 775 due to the cost and longevity of those platforms (they were useful until at least 2009, since there was no real need for dual cores at the time). That's why you don't see many boards.

939 was weird. I remember considering an upgrade from S462 back then. Early single-core CPUs made very little sense; they were not all that much faster than an AthlonXP, and all the new stuff they offered, like x64, was still completely useless. Then the X2s came out and they were ridiculously expensive. I wanted one, but $1000 was definitely too much. Then AM2 happened with cheap dual cores... so yeah, I went from S462 to AM2 directly.

I was under the impression that early AM2 stuff was just as affected by bumpgate as late S939; it was not fixed for a few years... perhaps there are other factors at play here, like cooling. The board I have absolutely roasts the chipset unless measures are taken to cool it with a fan.

I did the upgrade you skipped almost for free. I went from an Athlon XP 3000+ on an Asus A7N8X-E Deluxe (nForce 2 400 Ultra) to an Athlon 64 3000+ Venice with that same DFI. I didn't notice ANY difference in Windows, only in games, and mostly from the jump from the MSI Radeon 9600XT to the XFX GeForce 6800GS. Six months later I bought the Opteron, and THEN I noticed some differences, but mostly when I was using DVD Shrink and compressing with WinRAR. I only really felt the jump was justified when I bought the EVGA 8800GTS 320MB (pictured there).

Summing up: you did right. I could have skipped 939 and it would not have made a difference, because by the time the 8800 arrived it was AM2 and cheap dual-core time.


Reply 32 of 38, by AlexZ

Rank: Oldbie

On my Athlon 64 3400+ with GeForce GTX 275:
- F.E.A.R. (2005) - at 1600x1200 with maximum details: 96 fps average, 40 fps minimum (98% of frames above 40 fps), with FSAA off. With FSAA set to 2x the average drops to 84 fps; with FSAA set to 4x it drops to 76 fps.

The results of Crysis and F.E.A.R. show that a slow CPU with an overpowered GPU can be a surprisingly strong competitor. 1600x1200 is comparable to 1920x1080, as the pixel count is very close (1.92 vs roughly 2.07 megapixels, about 7% fewer).

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 275 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 33 of 38, by Archer57

Rank: Member
AlexZ wrote on 2025-08-16, 15:52:

On my Athlon 64 3400+ with GeForce GTX 275:
- F.E.A.R. (2005) - at 1600x1200 with maximum details: 96 fps average, 40 fps minimum (98% of frames above 40 fps), with FSAA off. With FSAA set to 2x the average drops to 84 fps; with FSAA set to 4x it drops to 76 fps.

The results of Crysis and F.E.A.R. show that a slow CPU with an overpowered GPU can be a surprisingly strong competitor. 1600x1200 is comparable to 1920x1080, as the pixel count is very close (1.92 vs roughly 2.07 megapixels, about 7% fewer).

Yep, a single 8800GT is not fast enough, while SLI probably has the same overhead it does in Crysis. When I am done with all the old stuff I'll try that second GTX660 I have lying around and see how that goes. I am not going to use it for the final build though.

Reply 34 of 38, by Repo Man11

Rank: l33t

My ultimate performance test for my Gigabyte 939 system was playing Half-Life 2 on it. I initially thought I'd be able to play it at 1280x1024 with no issues, but that turned out to be wildly optimistic. So began the process of choosing the DDR RAM carefully, overclocking, upgrading from a 4400+ to the Opteron 180, upgrading the cooling to an AM2 copper cooler, still having to overclock it, reducing the resolution to 1024x768, and turning some of the settings to low.

Even with all of that, there are scenes where there are stutters to the point of affecting game play, one example being when the Elites show up on the catwalk of the warehouse near the end of Anticitizen One. But it was good enough that I was willing to finish the game a couple of times on that system, though only after I had overclocked it.

After watching many YouTube videos about older computer hardware, YouTube began recommending videos about trains - are they trying to tell me something?

Reply 35 of 38, by nfraser01

Rank: Member

So although Socket 939 came out in 2004, the X2 CPUs didn't arrive until 2005. That would make an X1950 Pro period correct. I'd probably start there, personally.

Also note that a lot of the 939 CPUs overheated and self-destructed, I think?

Fun build though 😀

Reply 36 of 38, by Archer57

Rank: Member

So, a few updates...

The 3870x2 is dead. More precisely, the second GPU is. It works with CrossFire disabled, but that is not fun.

The regular 3870 also kicked the bucket, which was coming; it was unstable from the start and was probably retired for that reason.

Cooling on these cards is atrocious and I'd definitely not recommend anyone get any of them for any practical use. They are seemingly built to fail intentionally; no wonder they are rarer than nvidia cards from the same time period, bumpgate and all.

I also did this to the 3850:

The attachment 20250818_031414_D.jpg is no longer available

It does not look very nice, but it works extremely well: <40°C idle, <50°C under load, and almost completely quiet. Fan control also works. There are no RAM heatsinks, but the RAM is under direct airflow and stays completely cold.

Why? Well, because the stock cooler is atrocious: hot, loud, and annoying to clean. The worst part is that the minimum fan speed is way too high on this card (even the lowest that can be set with Afterburner). That is probably good for the card (unlike those 3870s, it did not cook itself at idle), but it was unusably loud. Also, the cooler is simply too small for the card; under load it sounded like a hairdryer and still reached 80+°C.

I also tested how the 4200+ CPU compares, and the difference is quite noticeable. For example, it gets ~11000 in 3DMark 2005 while the 4800+ with the same card gets ~13000, roughly an 18% gap. Even overclocked to 2.5GHz, which is higher than the 2.4GHz of the 4800+, it is still slower. Apparently cache matters.

I still have not made a final decision on what to use for the build; both options have advantages and disadvantages, while performance seems to be remarkably similar...

Reply 37 of 38, by AlexZ

Rank: Oldbie

I made a similar airflow fix on my older GeForce 9800 GT. The original fan was small, noisy and could not be fixed. I used a slim fan though, as a full-size fan would have produced too much noise sucking air through the little space available. I need 2 PCI slots below the PCIe slot. I also used just plastic strips for fastening.

There is a reason why I went with the s754 3400+ with 1MB cache. I just don't trust the AMD optimizations that are supposed to be worth the extra 512KB of L2 cache, similarly to Brisbane vs Windsor. In the case of s939, it's the same technology. Always go for 1MB of L2 cache per core.

For s939 to be meaningful it should beat s754. This means you need an Athlon 64 X2 4800+, PCIe and a GeForce 2xx or better.

Cost-wise, s939 doesn't seem practical, as the 4800+ and 4400+ with 1MB of L2 cache per core are stupidly expensive, just like they were back in the day. The Opteron 156 at 3GHz isn't available at all. You are lucky to have that 4800+.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 275 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 38 of 38, by Archer57

Rank: Member
AlexZ wrote on Yesterday, 20:18:

I made a similar airflow fix on my older GeForce 9800 GT. The original fan was small, noisy and could not be fixed. I used a slim fan though, as a full-size fan would have produced too much noise sucking air through the little space available. I need 2 PCI slots below the PCIe slot. I also used just plastic strips for fastening.

This was a little more than an airflow fix; here is the original cooler:

The attachment 20250818_113730_D.jpg is no longer available
The attachment 20250818_113712_D.jpg is no longer available

The cooler I installed was from an HD6850 or something like that. It required a bit of modding too, since it did not clear all the components (the original cooler actually has cutouts for them).

Since I have a standard ATX board and am only going to use one PCI slot for sound, I did not really care about size here...

Having now tested the card like this, I also wonder: were the pads for the memory simply a marketing feature? Were they useful? Were they harmful? The memory is barely warm to the touch, while with the original cooler it was heated up by the GPU to the point where the whole card was too hot to touch under load...

AlexZ wrote on Yesterday, 20:18:

There is a reason why I went with the s754 3400+ with 1MB cache. I just don't trust the AMD optimizations that are supposed to be worth the extra 512KB of L2 cache, similarly to Brisbane vs Windsor. In the case of s939, it's the same technology. Always go for 1MB of L2 cache per core.

For s939 to be meaningful it should beat s754. This means you need an Athlon 64 X2 4800+, PCIe and a GeForce 2xx or better.

Yeah, this is the issue that was discussed in this thread regarding S462, S754 and S939. Despite all the advances like x64, the integrated memory controller, new instruction sets, etc., neither S754 nor S939 was a reasonable upgrade over a high-end S462 at the time. Take a look at the performance of any single-core Athlon 64 with 512KB of cache at 2.2GHz or less, and the gains compared to a 2.2GHz AthlonXP would be... very small. And the 1MB 2.4GHz models were the very top end and quite expensive, while still being only marginally better.

The only really significant change was PCIe, but while it certainly is beneficial now, back then it was less so. It forced a card replacement, and at that point everything was still coming out in both variants with very similar if not identical performance, so instead of replacing the whole system, the old one could simply be upgraded with a new GPU...

Dual cores were meaningful since they provided other benefits: they radically improved the general experience of using the system and multitasking. But the cost on S939 was ridiculously high and single-thread performance was still very similar...

The really significant improvement happened with AM2: DDR2, higher frequencies, reasonable prices...

AlexZ wrote on Yesterday, 20:18:

Cost-wise, s939 doesn't seem practical, as the 4800+ and 4400+ with 1MB of L2 cache per core are stupidly expensive, just like they were back in the day. The Opteron 156 at 3GHz isn't available at all. You are lucky to have that 4800+.

I am fully aware that S939 is not a sensible platform to build. Funnily enough, it never was. And yes, I wanted that 4800+; that's why I waited so long...

It is fun though. Really cutting-edge hardware back in the day, something most people were not willing to spend money on, but also a glimpse into the future with dual cores, which only became mainstream a couple of years later with AM2.

How awesome is that: being able to run a browser, a music player, monitoring tools, etc. in the background while playing a game without it affecting performance? We are used to it nowadays, but back then it felt like magic...

Small fun fact: these Athlons advertise HyperThreading to the OS in order to benefit from the optimizations Intel and Microsoft worked on for the P4 with HT.
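
For anyone curious how that looks to software, here is a minimal sketch of my own (assuming GCC or Clang on x86; not code from this thread): CPUID leaf 1 exposes the HTT capability bit in EDX bit 28 and the logical processor count in EBX bits 23:16, which is what the OS reads when it decides how many logical CPUs a package has.

/* Sketch: query CPUID leaf 1 the way an OS or diagnostic tool would,
 * to read the "HyperThreading" capability bit and the logical processor
 * count a dual-core CPU reports. Assumes GCC/Clang on x86. */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 not supported");
        return 1;
    }

    int htt     = (edx >> 28) & 1;    /* HTT flag: more than one logical CPU per package */
    int logical = (ebx >> 16) & 0xff; /* logical processors reported per physical package */

    printf("HTT flag: %d, logical processors per package: %d\n", htt, logical);
    return 0;
}

If the fun fact above holds, an X2 would show the flag set with two logical processors per package, which is the same shape a P4 with HT presented to the OS.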