VOGONS


Geforce 2 GTS vs Radeon 7500

First post, by candle_86

Rank: l33t

So I'm trying to decide which one to use for Windows Me. System specs:

Athlon 700
512MB PC100
40GB HDD

It will run Windows Me. The system currently has the 256, but the hard drive died, so I'm redoing the PC, and now I have a 7500 I could use, since I won't be messing up any existing drivers. Which would y'all go with, the 2 GTS or the faster 7500?

I'm putting the 256 into an anti-static bag so it doesn't get damaged.

Reply 1 of 39, by leileilol

Rank: l33t++

They're about a year apart. If I had to choose between the two in a strictly budget mid-2001 upgrade situation, it'd be the 7500 (which is much more comparable to the GF2 Ultra, is more efficient, and has fun dither options). However, if I wanted to period things up to 2000 and absorb myself into the hype (which your specs seem to echo a bit, save for the excess RAM), it'd be the GF2 GTS.

There's also the usual fullscreen DirectDraw palette stalling worry with the whole GeForce line, which Radeons don't get in old games; that usually never gets brought up here unless it's a "dosbox is slow" thread. But back in 2000, the top concern for getting a GeForce2 wouldn't have been older paletted Windows games; it was that sweet new FSAA tech and big fps at big resolutions with a massive 64MB of VRAM, ready for Duke Nukem Forever, Max Payne, Tribes 2, Ultima Online 2, Black & White, Giants, The World Is Not Enough, Blade of Darkness, etc. It wasn't until late 2002 that both cards' HW T&L, compressed textures, and cubemapping really shined, with UT2003, NOLF2, and BF1942 all coming about two video generations later.

long live PCem

Reply 3 of 39, by swaaye

Rank: l33t++

Radeons support EMBM, and their anisotropic filtering carries a much smaller speed hit than you will see with a GF2 (or GF3/GF4, for that matter). You also don't get the DXT1 banding problems with a Radeon.
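
For what it's worth, the hardware advertises this stuff itself. Here is a minimal DX7-era sketch (assuming the DirectX 7 SDK headers and linking against ddraw.lib/dxguid.lib; error handling trimmed) that asks each Direct3D device whether it claims anisotropic filtering; note the caps only say it's there, not what it costs per frame:

    #include <ddraw.h>
    #include <d3d.h>
    #include <stdio.h>

    // Called once per enumerated device (RGB rasterizer, HAL, T&L HAL, ...).
    static HRESULT WINAPI DeviceCallback(char* /*desc*/, char* name,
                                         D3DDEVICEDESC7* caps, void* /*ctx*/)
    {
        // D3DPRASTERCAPS_ANISOTROPY flags support; dwMaxAnisotropy is the ceiling.
        int aniso = (caps->dpcTriCaps.dwRasterCaps & D3DPRASTERCAPS_ANISOTROPY) != 0;
        printf("%s: anisotropic filtering %s (max %lu:1)\n", name,
               aniso ? "supported" : "not supported",
               aniso ? (unsigned long)caps->dwMaxAnisotropy : 1UL);
        return D3DENUMRET_OK;  // keep enumerating
    }

    int main()
    {
        LPDIRECTDRAW7 dd  = NULL;
        LPDIRECT3D7   d3d = NULL;
        if (FAILED(DirectDrawCreateEx(NULL, (void**)&dd, IID_IDirectDraw7, NULL)))
            return 1;
        dd->QueryInterface(IID_IDirect3D7, (void**)&d3d);
        d3d->EnumDevices(DeviceCallback, NULL);
        d3d->Release();
        dd->Release();
        return 0;
    }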

On the other hand, you will have better game compatibility with a GeForce.

Which chipset is your Athlon motherboard using? Frankly, I prefer a Voodoo card with most early Athlon chipsets, because every other AGP card has some kind of stability issue with them. Once you get to nForce2, or something like the KT600, they finally become solid.

Reply 6 of 39, by pentiumspeed

Rank: l33t

This is one of many reasons I'm leery of VIA chipsets when running an Athlon XP CPU. I had that Asus board that supported the 333 FSB, which let me run a Barton CPU (an upgrade after running an earlier Athlon XP). Yes, it was a VIA chipset board, and that was one of many problems. What I remember most was constant crashing while trying to finish one of the Myst games back in the day; I did finally finish the game after putting up with a few artifacts. It was not my video card giving up: I was running 98SE, the drivers were updated, and the GPU was a GeForce2 MX. The HD3650 came much later, at the tail end of my 98SE days, when I finally transitioned to XP on a Core 2 Duo OptiPlex 980 with an HD3650 PCIe. Back in the day I bypassed the P4 entirely, going from a PII 350 (with a Tseng ET4000 PCI 2MB) to the Athlon with the GeForce2 MX to run realMyst as well. Then came an OptiPlex 780 on XP (built from the cheapest-priced parts in an OptiPlex 760 tower shell). I still have the OptiPlex 780 computer, but I restored it when someone donated an OptiPlex 780 tower.

Also, I now dislike the 2-phase CPU power circuitry that Asus used frequently on their VIA and nForce Socket 462 boards, with no 12V CPU power support. I prefer a 3- or 4-phase circuit fed from 12V, which is far better and more reliable, and which lets you buy standard ATX PSUs. Imagine pushing 80W at 5V (that's 16A) through a 2-phase circuit; it's foolhardy.

Cheers,

Great Northern aka Canada.

Reply 7 of 39, by maximus

Rank: Member

The GeForce2 GTS is hands down the better card. The Radeon 7500 looks good on paper, but it's crippled by poor drivers.

Check out my Windows 98 VGA charts. I tested both the GeForce2 GTS and Radeon 7500 on a Pentium III 1000 with a variety of games and benchmarks. The Radeon 7500 gets crushed by the GeForce2 GTS in the majority of the tests, despite the former having better specs.

ATI's drivers from this period are pretty atrocious. Aside from the poor performance, lots of games have rendering issues. Lack of table fog is pretty much a deal breaker for me with Windows 98 systems. ATI didn't really come into its own until the Radeon 9700 generation, which is a solid choice for either Windows 98 or Windows XP gaming.

I typically like playing around with underdog video cards. In this case, though, the Radeon doesn't really have any redeeming qualities that I can think of*. It's a shame ATI never got the drivers sorted out, as the hardware seems to be sound.

* This includes the anisotropic filtering. It's faster than Nvidia's implementation, but the quality is very poor. The Radeon 7500 supports anisotropic filtering to about the same extent the TNT2 series supports trilinear filtering (that is to say, not really).

PCGames9505

Reply 8 of 39, by leileilol

Rank: l33t++

The GeForce2 GTS didn't exactly have the bestest drivers at launch either... They (at least the 3D Prophet II) shipped with Detonator 4.12, which hadn't yet gotten the famous big Q3 speed boost, and they had the awful DXT appearance early on (fixed by 12.41). And then we can go on about the later 6x.xx drivers effectively sabotaging the card 😀

long live PCem

Reply 9 of 39, by Putas

Rank: Oldbie

maximus wrote on 2020-04-13, 18:16:

I tested both the GeForce2 GTS and Radeon 7500 on a Pentium III 1000 with a variety of games and benchmarks.

That's a small variety of games; a higher resolution or higher settings here and there wouldn't hurt either.

Reply 10 of 39, by Garrett W

Rank: Oldbie

maximus wrote on 2020-04-13, 18:16:

The GeForce2 GTS is hands down the better card. The Radeon 7500 looks good on paper, but it's crippled by poor drivers. […]

I don't want to be that guy, and I know benchmarking can be a tough and very time-consuming process, but of the games tested, you tried two OpenGL games (Quake 2 & 3), one game that's notoriously broken on anything but software and Glide (Unreal) and requires third-party renderers to shine, and one D3D game. Quake 3 was pretty relevant in 2000; Quake 2 less so, and with the settings and resolutions you tested, the cards run it very fast. Does it really make that huge a difference if the Radeon 7500 produces ~130 FPS and the GF2 GTS ~170 FPS? OpenGL was never ATi's strong suit, and you can show that off really well with some later titles, but it does pretty well in Q3.
Unreal's D3D renderer is simply broken IMO, and again, it's an older game.
Expendable is a solid D3D benchmark, I'd say, and the 7500 really takes a beating here.

I agree with the other people who posted here: it'd be nice if you included more games, tests with more features (such as AF), and even older drivers for some of the cards. I do generally avoid ATi cards from that era, but your results don't really reflect this. Again, I hate to be that guy and don't wish to belittle your commitment and the time you spent here.

Reply 11 of 39, by swaaye

Rank: l33t++

Unreal engine D3D might have been troublesome, but people certainly were using it with these cards. That would be UT and Unreal engine licensees though, not so much Unreal the game.

Quake 3 is an interesting OpenGL example, because you are probably seeing the best ATI could muster. It was critical to have good performance in Quake games to sell cards. Other OpenGL games may not even run right, because they weren't used in benchmarks. Like, say, the BioWare OpenGL games... (but that was more of an 8500/9x00 matter).

Reply 12 of 39, by leileilol

Rank: l33t++

There's always the r_primitives 0 auto-setting throwing performance potential off, which creates a weird Nvidia bias in Q3 engine stuff, plus other fun "ATI sucks at GL" things, like Q3's skyboxes getting clamp seams on ATI that Nvidia doesn't show, even though the seams are actually correct behavior; id blindly relied on Nvidia's incorrect visual output for that.
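
If you're benchmarking the Q3 engine yourself, you can take that variable out of the equation. A sketch of the usual config/console lines (the demo name depends on your point release, and which nonzero r_primitives value runs fastest on a given card is something to test, not assume):

    seta r_primitives "2"    // pin one rendering path instead of the per-vendor auto-pick
    timedemo 1
    demo four                // "demo demo001" on older point releases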

long live PCem

Reply 13 of 39, by Garrett W

Rank: Oldbie

swaaye wrote on 2020-04-14, 16:31:

Unreal engine D3D might have been troublesome, but people certainly were using it with these cards. […]

No argument there, but does it make sense to keep benchmarking it now that we know it is very troublesome in D3D and not really representative of what these cards are capable of? Both Unreal and UT are very important, but if Glide isn't used (or the software renderer, for CPU benches), I think one of the alternative renderers available online (mainly UTGLR) should be used instead, as sketched below; not only does it run miles better, it's also as feature-rich as the Glide renderer.
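
For reference, the UTGLR swap is just an ini change once the renderer DLL is in UT's System folder; a minimal sketch (stock UnrealTournament.ini section, OpenGL build of UTGLR assumed):

    [Engine.Engine]
    GameRenderDevice=OpenGLDrv.OpenGLRenderDevice

The D3DDrv.D3DRenderDevice line it replaces is the stock D3D renderer being discussed.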

And yes, you are completely right about those BioWare games; it's what I was thinking as well. Neverwinter Nights actually makes sense to test; it's really stuff like KOTOR that would be a little too demanding anyway. You could perhaps throw Call of Duty in there as well, another very popular OpenGL (and idTech3-derived, even!) game, although this too is pushing it somewhat, being a late 2003 title.

Reply 14 of 39, by maximus

Rank: Member

Putas wrote on 2020-04-14, 16:04:

Small variety of games, higher resolution/settings here and there wouldn't hurt either.

Those are both valid points. For that benchmark project, I wanted the results to be repeatable, so I stuck with games and demos that have built-in performance measurement tools. Unfortunately, there aren't many of those that are usable across the full range of DX6, DX7, DX8, and DX9 cards. AquaMark3 and Codecreatures are typically part of my benchmark suite, but the older cards can't really cope with them. There are also games like Descent 3 and Need for Speed: Porsche Unleashed that make great performance tests, but the lack of a timedemo mode means there's no great way to get numbers out of them.
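
To give one example of what I mean by a built-in tool, Quake 3's timedemo can be driven entirely from the command line, which is what makes it so repeatable (demo name varies by point release; r_mode 6 is 1024x768):

    quake3.exe +set r_mode 6 +set timedemo 1 +demo four

When the demo finishes, the console prints the frame count, elapsed seconds, and average fps.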

I did all the testing on a 1024x768 CRT, so that accounts for the absence of higher resolution settings.

I also left anisotropic and even trilinear filtering off, because the cards vary widely in how well (and even whether) they implement them, so it wouldn't really be a fair comparison.

PCGames9505

Reply 15 of 39, by Garrett W

Rank: Oldbie

Thanks for explaining your methodology. By the way, Descent 3 offers a timedemo; look it up! As far as the CRT goes, yeah, you are right, and it's relatable to a lot of people, since that resolution is pretty common for 15"-17" CRTs. I think you have a point on AF, especially when you're checking as many cards as you did, but I think it's important to show these things off when doing smaller benches, as they can make a huge difference.
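
(If memory serves, it's a command-line switch rather than a console command; something like the line below, where the exact switch spelling and the demo filename are from memory, so treat them as assumptions and check the game's readme:

    Descent3.exe -timetest benchmark.dem

It plays the recorded demo back as fast as the card allows and reports framerate statistics when it's done.)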

Here are some other suggestions for games that I use sometimes, perhaps you'll like some of them:

MDK2 - OpenGL action game released in 2000; even the demo has the built-in timedemo. It's pretty cool, if mostly CPU-limited, but it can get a bit more demanding at higher resolutions and 32-bit depth.

Forsaken - Classic D3D 6DOF action game from 1998. Very lightweight; it can do 60 FPS even on a single Voodoo2, so it's probably best kept for older cards. Has two timedemos; the Nuke demo is the one that's mostly used, AFAIK.

Turok 2 - D3D/Glide FPS ported from the N64, offering a pretty long timedemo. Can be pretty demanding even on later hardware. I'm still a little unsure of the results I've seen, but it's definitely worth a consideration. Avoid the first title, as it has performance issues above 30 FPS, so it doesn't really make for a strong case.

Dethkarz - D3D/Glide arcade-style racing game from 1999, if I'm not mistaken. Very fun title; it has a separate shortcut for a timedemo, called Performance Test. Looks great and offers a ton of tweaking in the options menu, which can be a little daunting. Great stuff, and great scaling on later systems to the point of (very) diminishing returns.

Incoming - D3D arcade shooter from '98(?). Very popular benchmark; the demo includes it as well. One of the most prominent games of the era for exposing visual artifacts and hardware omissions on a lot of early 3D cards.

Breakneck: N.I.C.E. 2 - D3D (maybe Glide as well? I forget) racing game from 1998. A lot less impressive compared to Dethkarz, but I thought I should mention it since it offers a nice built-in benchmark. I haven't messed with it too much, though.

Hope this helps!

Reply 16 of 39, by The Serpent Rider

Rank: l33t++

In all fairness, the Radeon 7500 should be compared to the GeForce4 MX 440.

Garrett W wrote:

Has two timedemos; the Nuke demo is the one that's mostly used, AFAIK.

I use Ship, because it's the default demo and it saves me some button presses =P

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 17 of 39, by Putas

Rank: Oldbie

Depending on timedemos is exploitable, might not represent the real gameplay experience, and leads to so many uselessly repeated tests.

17" CRTs can often go up to 1600x1200, just define those resolutions. It might not be nice to look at, but who cares when it is for benchmarks.

Reply 18 of 39, by Garrett W

Rank: Oldbie

The Serpent Rider wrote on 2020-04-15, 10:40:

Has two timedemos; the Nuke demo is the one that's mostly used, AFAIK.

I use Ship, because it's the default demo and it saves me some button presses =P

Yeah, you might be right; it's been a while.

Reply 19 of 39, by maximus

Rank: Member

Garrett W wrote on 2020-04-14, 23:17:

By the way, Descent 3 offers a timedemo; look it up!

Did not know that. Thanks for the tip!

Based on feedback, I'm thinking I may have to do a second round of Windows 98 VGA benchmarking 😀

Sounds like the following would be useful additions:

  • Additional game benchmarks
  • Higher resolution modes
  • Extra tests with trilinear filtering / AA / AF enabled

PCGames9505