VOGONS


First post, by mr_bigmouth_502

Rank: Oldbie

For whatever reason, I own a large number of these two particular cards, and I want to know which one would deliver better gaming performance overall in a typical Pentium 4 setup.

From what I understand, the Radeon 9000 is essentially a cut-down Radeon 8500, and thus is a DirectX 8-level card, while the Geforce FX5200 is a DirectX 9-level card, even though its DX9 performance is rather poor. 🤣

Reply 1 of 14, by fillosaurus

Rank: Member

I know nothing about the 9000; never tested one. But the 9200 and 9250 are faster than an FX5200. Based on my 3DMark tests, I can say the 9250 (which, ironically, is slower than the 9200) is just a bit slower than an FX5500; and the FX5500 is just a higher-clocked FX5200.
IMO, go with the Radeon.

Y2K box: AMD Athlon K75 (second generation slot A)@700, ASUS K7M motherboard, 256 MB SDRAM, ATI Radeon 7500+2xVoodoo2 in SLI, SB Live! 5.1, VIA USB 2.0 PCI card, 40 GB Seagate HDD.
WIP: external midi module based on NEC wavetable (Yamaha clone)

Reply 2 of 14, by mr_bigmouth_502

Rank: Oldbie

fillosaurus wrote:

I know nothing about the 9000; never tested one. But the 9200 and 9250 are faster than an FX5200. Based on my 3DMark tests, I can say the 9250 (which, ironically, is slower than the 9200) is just a bit slower than an FX5500; and the FX5500 is just a higher-clocked FX5200.
IMO, go with the Radeon.

That's kind of what I was thinking too. I remember the Halo demo being quite playable on my 2.4GHz Pentium 4 Prescott with 768MB of RAM and a 128MB Radeon 9000, while it absolutely choked on a 3.0GHz Prescott with an FX5200 and 512MB of RAM.

Reply 3 of 14, by NitroX infinity

Rank: Member

If you look on Wikipedia:
http://en.wikipedia.org/wiki/Comparison_of_AM … _9xxx.29_Series
http://en.wikipedia.org/wiki/Comparison_of_Nv … 85xxx.29_Series

The FX5200 has a 50MHz higher core frequency but a 50MHz lower memory frequency. The only big difference is that the 5200 also came in a version with a 64-bit bus. Based on that, I would say it all depends on which bus width your 5200s have.
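To put the bus-width question in numbers: peak theoretical memory bandwidth is just bus width times the effective (DDR) transfer rate. A quick sketch, assuming the commonly listed 200MHz DDR memory clock for a plain FX5200 (individual boards vary, so treat the figures as illustrative):

```python
def bandwidth_gbs(bus_bits, mem_mhz, ddr=True):
    """Peak theoretical memory bandwidth in GB/s.

    bus_bits: memory bus width in bits; mem_mhz: physical memory clock.
    DDR memory transfers data twice per clock cycle.
    """
    transfers_per_sec = mem_mhz * 1e6 * (2 if ddr else 1)
    return bus_bits / 8 * transfers_per_sec / 1e9

# 128-bit vs 64-bit card at a 200MHz (400MT/s effective) memory clock:
print(bandwidth_gbs(128, 200))  # 6.4 GB/s
print(bandwidth_gbs(64, 200))   # 3.2 GB/s -- half the bandwidth, hence the concern
```

Halving the bus halves the bandwidth outright, which is why the 64-bit OEM variants fall so far behind their 128-bit siblings in fill-rate-limited games.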

NitroX infinity's 3D Accelerators Arena | Yamaha RPA YGV611 & RPA2 YGV612 Info

Reply 4 of 14, by idspispopd

Rank: Oldbie

The Radeon 9000 also came on cards with a 64-bit bus. (Wikipedia doesn't tell you, but I accidentally bought one at the time; fortunately I could return it after some discussion.) You'd have to check to be sure.

Are you talking about a Radeon 9000 or a 9000 Pro? The non-Pro is usually passively cooled but clocked lower.
"Cut-down" is not completely right. ATI called it cost-optimized, and I remember there were some benchmarks (not the majority) that ran faster on a 9000 Pro than on an 8500.
(A 9000 Pro should still use less power than an 8500.)

Reply 5 of 14, by Standard Def Steve

Rank: Oldbie

I also vote for the 9000.

I have a Radeon 9250 PCI and an FX5200 AGP. When tested on an A64 3500+ system, the 9250 scores ~6300 in 3DMark01; the FX5200 only scores ~4200. I think the only reason the FX5200 beats my GF2 MX400 at all is that it can actually run the Nature test.

And as mentioned above, the Radeon 9000 is even faster than the 9250.

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 6 of 14, by mr_bigmouth_502

Rank: Oldbie

None of the Radeon 9000s I own are Pro versions, and most of the 5200s I have are 64-bit models, since they came from old Dell boxes. I have a few that aren't Dell OEM models, but I don't know for sure which of those are 64-bit models and which aren't.

On a related note, a friend gave me his old HP Pavilion a530n, which happens to have a Geforce FX5200XT. It seems to be a pretty snappy box in Win98SE with the latest unofficial service pack, though it's been a pain in the ass finding motherboard drivers for it that work properly without locking the system up or slowing it down. 🤣 Anyway, is the XT variant of the FX5200 any good, or is it just another crappy budget version?

Reply 8 of 14, by swaaye

Rank: l33t++

I'd say a 5200 Ultra would compare well with a 9000 Pro. NV3x does have better anti-aliasing and anisotropic filtering than the R200-based chips, though.

GeForce FX should run Halo in its D3D8 mode. If it runs in D3D9 mode, it will suffer badly.
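As a side note, Halo PC is commonly reported to accept command-line switches that force a particular shader path, which makes it easy to test this yourself. The switch names below are from memory and worth verifying against your own copy:

```shell
# Force the DX8-class pixel shader 1.1 path (reported switch; verify locally):
halo.exe -use11
# Related reported switches: -use14 (PS1.4) and -use20 (full DX9 PS2.0)
```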

Reply 9 of 14, by mr_bigmouth_502

Rank: Oldbie

Putas wrote:
mr_bigmouth_502 wrote:

Anyway, is the XT variant of the FX5200 any good, or is it just another crappy budget version?

the worst

I was kind of expecting an answer like that. 🤣 Honestly, WTF is it with OEMs and crappy GPUs that are cut down versions of other well-known cards? 😜

Reply 10 of 14, by swaaye

Rank: l33t++

OEMs needed a 2D card and wanted the latest 3D features for their advertising, but the baseline model had to be incredibly low cost. It was still better than an IGP of the time.

Reply 11 of 14, by sliderider

Rank: l33t++

mr_bigmouth_502 wrote:
Putas wrote:
mr_bigmouth_502 wrote:

Anyway, is the XT variant of the FX5200 any good, or is it just another crappy budget version?

the worst

I was kind of expecting an answer like that. 🤣 Honestly, WTF is it with OEMs and crappy GPUs that are cut down versions of other well-known cards? 😜

It's about satisfying a market and saving money. OEMs might have a lot of customers interested in casual gaming but not many hardcore gamers, so they won't put a $600 video card in computers sold to them; it would be overkill and nobody would buy it. They need a cheap card with some competency at 3D games to fill that niche at a price people are willing to pay, so they use cut-down versions of retail chipsets that are either intentionally disabled at the factory or defective in some way.

The manufacturer may also use a crippled memory controller to keep production costs down. A 64-bit controller requires fewer traces on the circuit board than a 128- or 256-bit one, so the cards can be made smaller and cheaper.

Also, a cut-down chipset frequently doesn't run as hot as its full-blown cousins, so exotic cooling solutions usually aren't required, which saves even more money.

Reply 12 of 14, by mr_bigmouth_502

Rank: Oldbie

sliderider wrote:
It's about satisfying a market and saving money. The OEM's might have a lot of customers interested in casual gaming but not so […]

Ah, I see. 😁 Another thing I wonder about: why are so many "premium" OEM systems sold with good CPUs, tons of RAM, and oodles of storage capacity, but crappy GPUs? I can understand putting a crappy GPU into a cheap mainstream system, but why do the same for an otherwise kickass box that sports, say, a Core i7 and 16GB of RAM? 😜

Reply 13 of 14, by sliderider

Rank: l33t++

mr_bigmouth_502 wrote:
sliderider wrote:
It's about satisfying a market and saving money. The OEM's might have a lot of customers interested in casual gaming but not so […]

Ah, I see. 😁 Another thing I wonder about is, why are so many "premium" OEM systems sold with good CPUs, tons of RAM, oodles of storage capacity, but crappy GPUs? I can understand putting a crappy GPU into a cheap mainstream system, but why do the same for an otherwise kickass box that sports say a Core i7 and 16GB of ram? 😜

For the same reason: money. Usually when the OEM has a configurator on their website, there are several options to choose from. It may actually work out to your advantage to buy the machine in the standard configuration and upgrade it later, because some OEMs *cough* Apple *cough* charge a ton of money for factory-installed upgrades. I see MacBooks and Mac Minis on eBay all the time with copious amounts of RAM and massive hard drives installed by the seller that would cost you way more if you bought them that way from Apple. It pays to shop around for prices on the parts you might be interested in upgrading later rather than just checking the boxes on the website. The OEMs know that a lot of people will load up a machine on the website without questioning the price, even if it is higher than normal retail for those parts.

Last edited by sliderider on 2013-04-30, 19:33. Edited 1 time in total.