VOGONS


Socket A: Nvidia vs Via - battle of the platforms!


Reply 1020 of 1041, by Trashbytes

Rank Oldbie
nd22 wrote on Yesterday, 15:07:
Trashbytes wrote on Yesterday, 15:01:
Archer57 wrote on Yesterday, 13:31:

A few comments...

Well, i totally understand your point of view and why it makes sense to recommend something you have experience with, but... there are plenty of good boards out there. IMO for someone building such a system without strong personal preferences trying to find something specifically from Abit makes very little sense.

It is "nforce2 ultra 400", not "nforce2 ultra" (does not exist), "nforce2", or "nforce2 400". This is very important because motherboard manufacturers tend to throw words like "ultra" in just for good measure. "nforce2 ultra 400" is the only north bridge which supports 400FSB and dual channel. Regular "nforce2 400" is single channel, "nforce2" - dual channel but 333. It is really easy to get confused here so have to be careful. Even more confusing - it seems that some boards switched to "nforce2 ultra 400" (which is newer) from "nforce2" at some point, for example EPoX EP-8RDA i have has "nforce2 ultra 400" (i even removed the heatsink to confirm), even though all info i can find online shows it should have regular "nforce2".

Also SoundStorm only matters if a PCI soundcard is not going to be used. If, for example, something from the audigy series is planned - there is no point, integrated audio is going to be disabled anyway. Might as well pick one with sata ports - those from nvidia actually work well with any modern devices, unlike the ones from via. They also do not require any drivers to work, unlike silicon image stuff. So very convenient.

MCP2-S/R and MCP2-GB also have 8 USB ports instead of 6 on MCP2/MCP2-T, which can be quite handy depending on case and needs.

I am pretty happy with EPoX EP-8RDA3I myself. 12V VRM, very nice monitoring and ability to fiddle with everything like all the frequencies, voltages, etc in bios, nforce2 ultra 400 north bridge... the only disadvantage is regular MCP2 southbridge so no sata and no soundstorm, but soundstorm would be useless for me anyway so no big deal.

The motherboard is one of those parts which does not affect performance as long as it works properly and has the same chipset (especially with no overclocking), so there are a lot of options here and a lot depends on personal preference.

I am curious who the actual chip manufacturer is. I've had very good success with sticks from samsung myself, so perhaps that's another option.

I always prefer sticks from actual chip manufacturers because this way there is way less lottery involved. With manufacturers like corsair, who do not have their own chips, you can get totally different ram which works completely differently under the same name, which is annoying.

Have to be careful though, nforce2 is very picky. I have a whole box of DDR1 and like 2/3 of it does not work properly on boards with this chipset. So sharing what tends to work more reliably can actually be very useful to avoid buying "few Kg of DDR1".

Timings... in my extremely limited testing i was not able to find a significant difference between sensible values like 3-3-3-8-1, 2.5-3-3-7-1 and 2-2-2-2-5-1 on my system (if you force more voltage it is possible to make regular 3-3-3-8 DDR400 work with 2-2-2-2-5-1 timings, at least with some sticks).

Hmm yes .. forgot about that abomination, well not abomination but it has its quirks. It's about the only exception here where you do have to go hunting for sticks that it likes. I feel this is an nVidia issue as a lot of their chipsets had this problem ..even the 775 nforce chipsets had issues with DDR2/DDR3 compatibility.

This is the only drawback of the Abit AN7. Before I used Corsair sticks I spent days testing I can not remember how many modules and none of them worked! Even with increased voltage none of them worked! Then I bought a Corsair stick and it worked!! From that moment I bought a few lots of Corsair modules and all of them work in my nforce boards.

IIRC nVidia never actually fixed the issue either and did what they normally do ...turned tail and left the scene, which is what I expect they will do with their consumer gaming GPUs very shortly. AI is too lucrative for them to let consumer GPUs keep eating into their AI hardware profits. For the best really, they seem to have just phoned it in for the RTX 5000 GPUs with little to no innovation or improvement to their raster ops, though if you LOVE AI fakery they have that in spades with their DLSS MFG.

It's not real frames but that doesn't matter, right?

All this talk of Socket A makes me want to dig out my MSI KT880 3200+ Barton board and throw the 5900 XT I just got at it, it's a nice setup. I wonder how a 6800 Ultra would run in it and if the 3200 can feed it fast enough.

Reply 1021 of 1041, by Archer57

Rank Member
nd22 wrote on Yesterday, 15:02:

1. I can not and will not recommend something I never worked with nor tested. My recommendations are based entirely on what I have used personally. While I do understand that there are many socket 462 boards out there, I can not say: use any nforce2 board you will find; that would be very bad advice as long as I have not tested it! Everything I said in this topic, including these final recommendations, is based on my personal experience! If I said the Abit KV7 is the best all rounder board, I said it because I tested it.
2. As I already said in this topic, I tested the chipsets. Because SoundStorm is included in the MCP-T, that provided some of its advantage over competing chipsets. If you want to use a Creative audigy, please go ahead and use it.
3. Already answered that but I am going to say it again: all, and I mean all, Corsair modules worked in the AN7/NF7-S 2.0/NF7-S2G - with the lowest possible voltage I must add - regardless of revision! So my advice is pretty simple: based on my personal experience I said: go with Corsair it will work in any of the boards I recommended! I stand by that advice.

Just to be clear - i am not here to argue, judge or try to convince anyone something is "right" or "wrong". Sorry if it comes across that way. Just wanted to share my own experience with this platform and thoughts based on that. Perhaps different opinion/point of view on some things like that situation with audio. If you think it is not appropriate just say and i'll stop.

Trashbytes wrote on Yesterday, 15:15:

All this talk of Socket A makes me want to dig out my MSI KT880 3200+ Barton board and throw the 5900 XT I just got at it, its a nice setup. I wonder how a 6800 Ultra would run in it and if the 3200 can feed it fast enough.

Well, i can tell that there is a very large difference between the 5900XT and the 7600GT. I am definitely seeing CPU limitations in some stuff that's possible to run with the 7600GT, but also a lot of stuff that runs horribly on the 5900XT runs very well with the 7600GT. And that's probably a pretty good approximation of the 6800 ultra, which i do not have.

Reply 1022 of 1041, by AlexZ

Rank Oldbie
Trashbytes wrote on Yesterday, 15:15:

All this talk of Socket A makes me want to dig out my MSI KT880 3200+ Barton board and throw the 5900 XT I just got at it, its a nice setup. I wonder how a 6800 Ultra would run in it and if the 3200 can feed it fast enough.

Definitely do it, I welcome a fact based discussion. Please run 3dmark 2003 and 3dmark 2005 with the 3200+ Barton in 1024x768 and 1600x1200 on the 5900 XT. Take a screenshot including the per-game fps details, not just the final score. 3dmark 2003 represents games where the Athlon XP should shine and 3dmark 2005 represents games where it will struggle due to being CPU bottlenecked (2005 and later games).

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, GeForce FX 5600 128MB, Voodoo 2 12MB, Yamaha SM718 ISA
Athlon 64 3400+, Gigabyte GA-K8NE, 2GB RAM, GeForce GTX 260 896MB, Sound Blaster Audigy 2 ZS
Phenom II X6 1100, Asus 990FX, 32GB RAM, GeForce GTX 980 Ti

Reply 1023 of 1041, by nd22

Rank Oldbie
Archer57 wrote on Yesterday, 15:23:

Just to be clear - i am not here to argue, judge or try to convince anyone something is "right" or "wrong". Sorry if it comes across that way. Just wanted to share my own experience with this platform and thoughts based on that. Perhaps different opinion/point of view on some things like that situation with audio. If you think it is not appropriate just say and i'll stop.

I will repeat myself again: Did I make mistakes? Of course I did. I welcome any critique or opinion, as I am sure that there are many people on this forum far more knowledgeable about socket A than I am who did far more tests than I did.
Please feel free to share your thoughts and experiences with socket A!
What I must underline is the fact that everything I say or recommend is based on what I have tested personally. My recommendation for the ultimate socket A system is going to be the Abit AN7 and not the Asus A7N8X deluxe because I did not test that board. So my advice is very specific; if a user decides to follow it they will get results in games and benchmarks very close to what I got. As you will see shortly when I make a few recommendations for video cards, some very well known GPU's will be missing simply because I do not have them so I did not test them.

Last edited by nd22 on 2025-06-14, 03:02. Edited 1 time in total.

Reply 1024 of 1041, by nd22

Rank Oldbie
Trashbytes wrote on Yesterday, 15:15:
[…] All this talk of Socket A makes me want to dig out my MSI KT880 3200+ Barton board and throw the 5900 XT I just got at it, its a nice setup. I wonder how a 6800 Ultra would run in it and if the 3200 can feed it fast enough.

Please put that system together! I got a bunch of results on Abit KW7 and we can compare them! The only problem is: I do not have a geforce 5900XT.

Reply 1025 of 1041, by nd22

Rank Oldbie

VIDEO CARD – Having said what I consider people building a socket 462 system should be looking for in a video card, here is my classification of AGP cards based on what I have and tested myself:

Tier 1: geforce 7800gs/7900 series/radeon X1950 series/X850 series. Extremely expensive, with prices starting at 100 USD and going up every single month, these cards are reserved only for those with deep pockets wanting to build the ultimate AGP system. Also you must take into consideration that all these cards are bottlenecked even by the Athlon XP 3200 and it is not worth pairing such rare and expensive cards with a socket A platform.
I can not recommend any of the above cards, not even for your dream socket 462 machine!
You have seen the geforce 7800gs in hundreds of screenshots in this topic and while it is a very good card that eliminates the GPU bottleneck, its performance will be limited even by the Athlon XP 3200.
I got a gainward geforce 7900gs and because I got it much later you have not seen that card used for testing; however, because the CPU is weak relative to the geforce 7900gs's power, it is even more limited in performance by the Athlon XP.
I also got a radeon X1950 pro with 512mb from Sapphire. Needless to say, that card is even more CPU limited.
The Radeon X850 XT and XT PE that I got are in PCI-express form factor. From testing on an Athlon 64 their performance is so much better than the geforce 6800 series that I included them here. They are also limited by the K7 architecture.

Reply 1026 of 1041, by nd22

Rank Oldbie

Tier 2: geforce 6800GT/ultra/7600GT: I have the geforce 7600GT from Leadtek and a geforce 6800GT reference model; from Ati I do not have any AGP or PCI – express card from the X800 series so I can not comment on them, but if anyone has them and used them in their socket A system, feel free to comment on their performance.
In every single test the 7600GT is faster than the 6800GT; taking into account that it is also far cheaper (the 6800GT AGP is considered a collector's item as it has full support for Windows 98), draws less power and produces less noise, the geforce 7600GT is the ideal card with which the Athlon XP 3200 should be paired. The balance is just perfect, there is no bottleneck and you can play all games from 2000 up to 2004 at max settings: the 7600GT earns a full recommendation for everyone building the ultimate Athlon XP system.
Let me reiterate: the 3200 and the geforce 7600GT on an Abit AN7 are a perfect match and constitute the dream Athlon XP machine! The downside is the price: 60 – 100 USD on eBay, so I hope you have deep pockets or lots of luck.
I got a Geforce 6800 ultra in PCI – express format and its performance is really, really close to that of the Geforce 7600gt (also PCI – express). Because of the rarity and price of the ultra I can not recommend it; the 7600GT remains the card of choice for your dream socket A machine!

Reply 1027 of 1041, by Archer57

Rank Member

One card i can absolutely recommend is the DDR3 version of the 7300GT. Given how bad low end cards from nvidia generally tend to be, this one is often underestimated and sold very cheap, but it is actually a very good card. It uses a slightly cut down G73 with the same memory as the 7600GT and on my system i've seen it perform very well - 3Dmark scores are ~70-90% of the 7600GT's (more difference in newer versions) and many real games are completely indistinguishable on both cards - either CPU limited or fast enough anyway.

Even more importantly - it allows playing games from the intended period comfortably, unlike cards like the 6600 or 5900XT (i'll put the 5900XT into the socketA test bench i still have set up and run 3dmark with the 3200+ barton a bit later).

It is not "the ultimate" card, but for those on a limited budget IMO a very good option.

By the way, one comment about memory on these cards - memory type is very, very important. I've seen a 7600GT with DDR2, and AMD cards from the HD series with DDR2 are also very common - much more common than ones with DDR3 and often sold for the same or very similar price. Those should be avoided - the performance difference is very large, even calling them the same card feels wrong. And for those cards where versions with different bus widths exist - that is similarly very important.

Reply 1028 of 1041, by nd22

Rank Oldbie

I have yet to see a 7600gt with ddr2 memory. I do not deny that they exist but I have not seen one. I got 2 geforce 7600gt, one from leadtek and one from palit, both with stock clocks: 560 mhz core, 1400 mhz memory.
I also got a geforce 7600gs and that one comes with ddr2.
One other card that I have not seen is the ati radeon x1650xt in agp. My best friend has one in pci express format and it is always in front of my pci express 7600gt.

Reply 1029 of 1041, by Archer57

Rank Member

Yeah, DDR2 is optional. From what i've seen in the AGP 7 series, the 7300GT, 7600GT and 7600GS definitely exist in both variants. And the DDR2 ones are always no good.

The 7600GT i have is probably the exact same one from palit; at least it has the same frequencies, which are pretty much stock. There definitely were faster cards out there, but at least the cooling is good on this one, which probably matters more.

Also here are 5900XT benchmarks:

The system: attachment 3200+5900xt.jpg (no longer available)
The benchmarks: attachments 3200+5900xt-2001se.jpg, 3200+5900xt-2003.jpg, 3200+5900xt-2005.jpg (no longer available)

As i expected it is quite bad. The card is expensive nowadays, often more expensive than 7600GT, and it absolutely fails to compete even with 7300GT, especially in newer stuff.

There are also issues which are not reflected well in benchmark scores, like FPS drops in certain conditions - explosions, fog, etc. Even in the car chase in 2001SE, FPS drops below 30 at times during explosions. This makes games feel much slower than FPS alone would suggest and does not happen on 7 series cards.

The only reason to get this card would be for compatibility with older stuff, probably for win98 build. There it will perform well enough too.

I'll post some benchmarks with the 5900XT in the system i use it with, which is a 2200+ thoroughbred-b on a KT333CF board, a bit later, just to see how much the results are affected by the CPU.

Also a few caveats: i've used driver 175.19, which is in this case the last driver to support this card (very new), and the 3200+ is actually an overclocked 2500+, but that should not matter.

It also showcases the previous point about 2500+ - this board has no way to adjust voltages and this CPU just works, perfectly stable with FSB set to 400.

I've also measured voltages while i was at it - i am using a modern-ish SFF 300W PSU which absolutely can not handle 5V-heavy systems, and in this case (12V CPU VRM, card is 3.3+5V) it worked great - 5V stayed within 5-5.1V and 12V within 12-12.1V during all the tests.

Reply 1030 of 1041, by Trashbytes

Rank Oldbie
Archer57 wrote on Today, 08:10:
Yeah, DDR2 is optional. From what i've seen in AGP 7 series 7300GT, 7600GT and 7600GS definitely exist in both variants. And DDR […]

I have a 7900GS 256 DDR3 AGP in a box around here; I may dig it out and see what it can do ...assuming all the parts still work. Been going through my motherboards and GPUs of late and a few of them now need recaps. Have a nice EPoX EP-8RDA+Pro that has gone bad in storage and now needs most of its VRM recapped; the board was ok when stored. Another bit is a previously working 9600 Pro that blew two of its caps when I tried to test it a few weeks ago; it too is now on the repair shelf.

I guess that's just how it goes with retro kit but it makes me not want to touch my GPU collection as I already have too many bits on my repair shelf.

The 5900XT numbers are pretty much what I expected 🤣 .. it's still an FX card after all and they just outright sucked at DX9 stuff, and even the mighty FX 5950 couldn't brute force past the failures nvidia lumped on it. (Throw 3dmark 2000 SE at the 5900XT for shits and giggles)

The 2500+ is just a 3200+ with lower multipliers .. the two chips are identical other than that, and many 2500+ CPUs were in fact just downclocked 3200+ CPUs, so seeing it run at 3200+ speeds isn't surprising and by rights it should OC a bit more.

Reply 1031 of 1041, by Archer57

Rank Member
Trashbytes wrote on Today, 08:23:

The 5900XT numbers are pretty much what I expected 🤣 .. its still a FX card after all and they just outright sucked at DX9 stuff and even the mighty FX 5950 couldn't brute force past the failures nvidia lumped on it. (Throw 3dmark 2000 SE at the 5900XT for shits and giggles)

The most amusing results are in 2005; it is just at a point where transitioning from frames per second to seconds per frame may make sense. Though to be fair it does have only 128MB vram, so that may be contributing here.

I'll see if i can run 2000, i remember having trouble with it on XP and i do not have 98 here.

Trashbytes wrote on Today, 08:23:

The 2500+ is just a 3200+ with lower multipliers .. the two chips are identical other than that and many 2500+ CPUs were in fact just downclocked 3200+ CPUs so seeing it run at 3200+ speeds isnt surprising and by rights it should OC a bit more.

It actually has the same multiplier - 11. The beauty of this specific chip, AXDA2500DKV4D, is that it can mimic the 3200+ completely, for those who want "the fastest socketA CPU" but do not want to pay for it. It is not surprising, but it is fun - the lowest end barton becoming the highest end one by just switching the FSB from 333 to 400, with no extra actions required and no special requirements from the motherboard (any motherboard will be able to switch between the standard 200/266/333/400 FSB speeds).
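For anyone wondering how the numbers work out, here is a quick back-of-the-envelope sketch (the multiplier and ratings come from the posts above; the "333"/"400" figures are DDR ratings, so the underlying clock is half that):

```python
# Back-of-the-envelope math for the 2500+ -> 3200+ trick described above.
def barton_clock(multiplier: float, fsb_ddr_rating: int) -> float:
    """Core clock in MHz from the (locked) multiplier and the DDR-rated FSB."""
    return multiplier * (fsb_ddr_rating / 2)

print(barton_clock(11, 333))  # ~1833 MHz (nominal) - stock Barton 2500+ (AXDA2500DKV4D)
print(barton_clock(11, 400))  # 2200 MHz - same chip at 400 FSB, i.e. stock 3200+ speed
```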

And yeah, on a different board with more control + some extra voltage it could go higher...

In fact, based on what i've seen, i'd call a build like this the "dream socketA system on a budget" - this CPU + a 7300GT (definitely not a 5900XT) is very cheap. Combined with any nforce2 ultra 400 motherboard it is like 90% of what a much more expensive system with a real 3200+ CPU, a 7600GT+, a fancy motherboard, etc can do, for like 10% of the cost.

Trashbytes wrote on Today, 08:23:

I guess thats just how it goes with retro kit but it makes me not want to touch my GPU collection as I already have too many bits on my repair shelf.

Yeah, that is how it goes, and it is indeed annoying. Theoretically the way to go is to just replace the capacitors with nice new ones right away, but it takes time and money and i am way too lazy... so i have to just deal with failures as they happen...

This is also the reason i like to replace parts that are easy and cheap to replace with modern ones in my builds - PSUs, HDDs, fans, etc. Keeps them working much more reliably.

Reply 1032 of 1041, by Trashbytes

Rank Oldbie
Archer57 wrote on Today, 08:41:
The most amusing results are in 2005, it is just at a point where transitioning from frames per second to seconds per frame may […]

There is one weird thoroughbred 2800+ that is rated for 2250MHz. I have never seen one in the wild, but IIRC a member here got one.

Been on the hunt for one myself, or the OEM 3200+ with the 333 FSB that's rated for 2333MHz.

Sadly I'm not that great at hunting Unicorns 🤣

Reply 1033 of 1041, by Archer57

Rank Member
Trashbytes wrote on Today, 10:33:

There is one weird thoroughbred 2800+ that is rated for 2250Mhz I have never seen one in the wild but IIRC a member here got one.

Been on the hunt for one myself or the OEM 3200+ with the 333 FSB thats rated for 2333Mhz.

Sadly Im not that great at hunting Unicorns 🤣

Everything depends on whether you want an actual collectible item though. If you do then yeah, you have to do some hunting and be ready to spend some money. I was lucky enough to find a guy who apparently did not understand the value. Was buying other stuff from him (DDR1 in large quantities for the nforce2). Asked him if he had CPUs and he gave me a box with a bunch of fun stuff - 3200+, 3000+ 333FSB, a bunch of those 2500+ ones... he sold the whole box for $50 and was actually really happy. Nothing too rare though, like no 333FSB 3200+...

If, however, you just want to play around with it, see how it performs and perhaps use it - you can always find variants with the same multiplier and one step lower FSB. Those tend to be relatively easy to get and usually overclock without issues. For example there is the AXDA2500DKV4C (266FSB 2500+) which would probably mimic the 333FSB 3200+ unicorn just fine.

Reply 1034 of 1041, by Trashbytes

Rank Oldbie
Archer57 wrote on Today, 11:25:

Everything depends on if you want actual collectible item though. If you do then yeah, have to do some hunting and be ready to spend some money. I was lucky enough to find a guy who apparently did not understand the value. Was buying other stuff from him (DDR1 in large quantities for the nforce2). Asked him if he had CPUs and he gave me a box with a bunch of fun stuff - 3200+, 3000+ 333FSB, a bunch of those 2500+ ones... he sold whole box for $50 and was actually really happy. Nothing too rare though, like no 333FSB 3200+...

If, however, you just want to play around with it, see how it performs and perhaps use it - you can always find variants with the same multiplier and one step lower FSB. Those tend to be relatively easy to get and usually overclock without issues. For example there is AXDA2500DKV4C (266FSB 2500+) which would probably mimic 333FSB 3200+ unicorn.

I have a couple of items that might fit the collectible tag but I didn't buy them to throw them on a shelf, I like to use the things I buy, which is why I don't normally buy NOS or NIB items. I leave those for the collectors who want to keep them as is, since I do understand the value in such things. (Even if i don't agree with buying something just to throw it in a collection and never use it)

But if I ever got my hands on a Unicorn 3200+ .. not sure I could resist firing it up and giving it some gas.

I could emulate a unicorn 3200+ with the 333FSB 2500+ by unlocking the multiplier on it, which isn't hard to do with the early production 2500s before they all got superlocked.
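Rough numbers for that route, assuming the rating quoted earlier in the thread (2333 MHz for the unicorn 333FSB 3200+, i.e. a 14x multiplier on the ~166.7 MHz bus):

```python
# Multiplier-unlock route: same 166.7 MHz (333 DDR-rated) bus, higher multiplier.
fsb_mhz = 166.7       # "333 FSB" is DDR-rated; the real clock is ~166.7 MHz
print(11 * fsb_mhz)   # ~1834 MHz - stock 333FSB Barton 2500+
print(14 * fsb_mhz)   # ~2334 MHz - multiplier raised to 14, matching the
                      # 2333 MHz rating of the OEM 333FSB 3200+
```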

Reply 1035 of 1041, by nd22

Rank Oldbie
Trashbytes wrote on Today, 10:33:

There is one weird thoroughbred 2800+ that is rated for 2250Mhz I have never seen one in the wild but IIRC a member here got one.

Been on the hunt for one myself or the OEM 3200+ with the 333 FSB thats rated for 2333Mhz.

Sadly Im not that great at hunting Unicorns 🤣

There are some very rare processors on socket 462 and the T-bred 2800+ is one of them. I have a box full of CPU's but I do not have any limited edition ones!

Reply 1036 of 1041, by Archer57

Rank Member

So, as promised, FX5900XT on slower system:

The system: attachment FX5900XT_AXP2200.jpg (no longer available)
The benchmarks: attachments FX5900XT_AXP2200_2001.jpg, FX5900XT_AXP2200_2003.jpg, FX5900XT_AXP2200_2005.jpg (no longer available)

Even worse, as expected, but at least here the card makes some sense for win98 stuff.

Reply 1037 of 1041, by Archer57

Rank Member

And then i thought i'd compare it to... something, while i have all the stuff still set up. So i came up with the idea to see how a 6600 non-GT with really slow 128bit DDR1 would perform. This card was released less than a year after the 5900XT, has dramatically slower memory and is in general low-end. How bad can it be?

The system: attachment 3200+6600.jpg (no longer available)
The benchmarks: attachments 3200+6600-2001.jpg, 3200+6600-2003.jpg, 3200+6600-2005.jpg (no longer available)

Yeah, FX bad. Do not get anything FX if compatibility with really old stuff is not the goal. It is amusing how a card with such slow memory manages to be ~similar in old stuff and so much better in new stuff. At least we are back to frames per second here, instead of seconds per frame, in 2005...

Reply 1038 of 1041, by nd22

Rank Oldbie

Tier 3: geforce 6600GT/5900 series/radeon X1650 series/9800 series: I have the geforce 6600GT from Leadtek with "correct" clocks - 500 MHz for the core and 1000 MHz for the memory, so it is not underclocked like other 6600GTs - and the drivers, power draw and noise are more than fine; in addition the price is relatively low: 40 – 60 USD on eBay for a good AGP video card in 2025 is OK. It can run all games from 2000 – 2003 at max settings at high frame rates, so it earns a full recommendation as it fits perfectly into option 2: the best price to performance ratio system.
The Radeon X1650 pro that I have is highly problematic with the drivers and has extreme lag when alt-tabbing out of a full screen application such as a game! I simply do not like my radeons with the Rialto bridge in a socket A system; I prefer geforce cards all day.
The Geforce 5900 with its many variants – standard/ultra/5950 ultra, but not XT – can also run 2000 – 2003 games pretty well; it is however a collectible item so prices are steep, power draw is huge and the FX series is, after all, a failure. The only advantage it has is native AGP, so no HSI bridge! I have a 5900 ultra and a 5950 ultra and they perform well for the aforementioned period, however the noise is simply unbearable!
The Radeon 9800 standard/pro/XT, but not SE, are very good and cover the same 2000 – 2003 period but have some of the same drawbacks as the geforce 5900 series: steep price, collectible item, rare. It is also native AGP so no Rialto bridge, and performance is simply outstanding! I got a 9800 PRO with 128 MB and aftermarket cooling from Zalman; it is very quiet and performance is very good; my 9800XT is a standard card with the default ATI cooler; performance is amazing in every single game up to 2003. Ati deserves all the praise for the 9000 series: quiet, high performance, single slot. Still, they cover the same period as the 5000 series from NVIDIA; as soon as you get to 2004, max settings with good performance is out of the question.
The only card I recommend is the geforce 6600GT, the best price to performance ratio out of all AGP video cards! I should mention the drivers; the ones from NVIDIA are very good!

Reply 1039 of 1041, by nd22

Rank Oldbie

Tier 4: radeon 9600XT/9800SE/geforce 5900XT: price, power draw, low noise and native AGP are the strong points of the 9600XT.
There aren’t any comparable cards from NVidia below the 5900XT, as the geforce 5200/5500/5600/5700/5800 series are not even worth mentioning. The 9800SE is too expensive to recommend but the 9600XT price is around 20 – 40 USD, so pretty cheap for an AGP card. The major downside is performance: you will be limited to 2002 games at most if you want to play at max settings. The Geforce 5900XT is a power hungry and noisy beast that is also too expensive to recommend.
Performance is poor on all of them; however, the 9600XT is my recommendation for anyone building a socket 462 system for the first time who does not yet have an AGP card and does not want to spend lots of money before knowing that the platform is problem free and all components are fine.
I have 2 identical 9600XTs from Abit: default clocks – 500 for the core and 600 for the memory – both with 256mb and an aftermarket Zalman cooler that I installed. They are very, very quiet, run very cool and have excellent DirectX 7 and 8 performance even at high resolutions such as 1600*1200.