VOGONS


ATi GPU Thrill Thread


Reply 22 of 85, by Pippy P. Poopypants

User metadata
Rank: Member
swaaye wrote:

ATi Radeon LE 32MB DDR AGP

*snip*

I remember Newegg and ZipZoomFly got a ton of crap for advertising these as full-fledged 32 MB Radeon 7200 DDRs (which are clocked at 164/164 and use active cooling). Whether HyperZ could be fully enabled was pretty much the luck of the draw. One thing to note, though: this was in fact an official retail product in China:

http://web.archive.org/web/20030608121648/htt … /info/20010123/ (too lazy to translate it right now, but google translate is your friend)

Supposedly it was positioned to compete with GeForce2 MX cards. Initially it was erroneously reported to have a 64-bit memory interface, but in fact it was just a "normal" R100 with lower clocks and HyperZ disabled.

P.S. There was a review site that had a boxshot available, but it's long gone and the page was never archived.

GUIs and reviews of other random stuff

Вфхуи ZoPиЕ m
СФИР Et. SEPOHЖ
Chebzon фt Ymeztoix © 1959 zem

Reply 23 of 85, by swaaye

User metadata
Rank: l33t++

Yeah I remember the R100 days as being confusing and tricky. It was difficult to know just what clock speeds to expect from a specific Radeon card. OEM and retail were different. OEM versions varied amongst themselves too.

With the LE, at least it was cheap, and that AnandTech article solidified just what to expect from one. The GeForce2 GTS blew R100 out of the water in most cases, and ATI's drivers sucked hard back then, so a cheap R100 was the only way to go, if at all.

I did Radeon LE and then 8500 LE. They were good values I think in the end even though the drivers did suck. 😀

Reply 24 of 85, by SquallStrife

User metadata
Rank: l33t
ProfessorProfessorson wrote:
http://i9.photobucket.com/albums/a66/Amakusa666/shockik7.gif
/\
Pretty much how I feel about any Radeon below the 7500 line.

Haha, I love it! :)

VogonsDrivers.com | Link | News Thread

Reply 26 of 85, by ProfessorProfessorson

User metadata
Rank: Member

The Radeon 7500 was actually pretty fast, and very competitive with the GeForce2 and GeForce4 MX lines. I have ZERO complaints about that card, as it was priced right for the performance, and it delivered great gameplay.

Reply 27 of 85, by swaaye

User metadata
Rank: l33t++

7500 / RV200 is pretty nice, yeah. It's basically R100 at 150 nm, but they also fixed some bugs. I think they fixed something in its anisotropic filtering, because it has intermediate AF levels unlike R100. They also fixed R100's inability to use asynchronous clocks. And of course it benefits from years of driver development for R100.

Asynchronous clocks make a lot of sense because they were giving R100 a lot more bandwidth than it could really use. The 7500 shifted the core clock up and the RAM clock down a bit. I have a 7500 with RAM that will go from 230 to 320 MHz, but it gains little performance. It's probably fillrate limited.
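For a rough sense of that mismatch, here's a back-of-envelope sketch in Python. The clock and pipe numbers are approximate, from memory, not authoritative specs:

```python
# Rough peak-number sketch (assumed specs, from memory):
# RV200 (7500): 2 pixel pipes, ~290 MHz core, 230 MHz DDR on a 128-bit bus

def peak_fillrate_mpix(core_mhz, pipes):
    """Theoretical pixel fillrate in Mpixels/s."""
    return core_mhz * pipes

def peak_bandwidth_gbs(mem_mhz, bus_bits, ddr=True):
    """Theoretical memory bandwidth in GB/s."""
    rate = mem_mhz * (2 if ddr else 1)   # DDR transfers twice per clock
    return rate * (bus_bits / 8) / 1000  # MB/s -> GB/s

print(peak_fillrate_mpix(290, 2))    # 580 Mpix/s
print(peak_bandwidth_gbs(230, 128))  # 7.36 GB/s at 230 MHz
print(peak_bandwidth_gbs(320, 128))  # 10.24 GB/s at 320 MHz
```

Even at 32 bits per pixel, 580 Mpix/s of fillrate can't come close to consuming 7+ GB/s of peak bandwidth, which fits the observation that raising the RAM from 230 to 320 MHz gains almost nothing.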

7500 is also very cool running. I unplugged the fan on the card and it doesn't overheat even with the tiny heatsink there.

Reply 29 of 85, by swaaye

User metadata
Rank: l33t++
leileilol wrote:

7500's slow at Doom3 though, unlike the Geforce2 😀

There's a trick to speeding it up a lot: turn off vertex buffers with a console command. It will go from a slideshow to probably about GF2 speed, I'd guess.

r_usevertexbuffers 0

KOTOR can be sped up with a config file tweak to disable vertex buffer objects as well.
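For reference, both tweaks boil down to a one-line setting. The file locations and the exact KOTOR key name below are from memory, so treat them as assumptions and check against your own install:

```
// Doom 3: add to DoomConfig.cfg (or an autoexec.cfg) in the base/ folder
seta r_useVertexBuffers "0"

; KOTOR: swkotor.ini in the game folder
[Graphics Options]
Disable Vertex Buffer Objects=1
```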

But yes NVIDIA is always the better choice for GL games.

Reply 30 of 85, by ProfessorProfessorson

User metadata
Rank: Member
leileilol wrote:

7500's slow at Doom3 though, unlike the Geforce2 😀

Doesn't matter, I wouldn't play Doom 3 on anything below a Geforce FX 5950 or Radeon 9700 anyway.

Reply 31 of 85, by swaaye

User metadata
Rank: l33t++

Some of us like to try crazy things like Doom3 on a Voodoo2. Practicality is not a consideration. 😉

With Doom 3, id actually built in legitimate support for NV10 and later because of the sheer number of people with NV1x boards. It's surely the most advanced use of those chips, but it was primarily a business decision to make more sales. Radeon has similar register-combiner, pre-pixel-shader hardware to NV1x, but apparently it's not quite as flexible, and R100-RV200 had nowhere near the same market penetration as NV1x, so id didn't bother to cater to them.

Reply 32 of 85, by ProfessorProfessorson

User metadata
Rank: Member

Yeah, I've tried Doom 3 before on a GF2 GTS, GF3, Ti 4200, and an FX 5200 for the hell of it. The GF3 and Ti 4200 did OK at 800x600 with shadows off, but not OK enough to make me want to play the game that way on that hardware. It felt barely a step above the Xbox port.

Reply 33 of 85, by Pippy P. Poopypants

User metadata
Rank: Member

I have yet to play Doom 3 on any pre-R300 card, but from what I hear it doesn't seem to suffer from the same graphical artifacts present on NV1x/NV2x cards (most noticeable on the carried weapon models).

And the built-in NV10 support is quite impressive too, considering that Carmack was knocking the GeForce4 MX cards when they were first released. But what goes around comes around 🤣.


Reply 34 of 85, by swaaye

User metadata
Rank: l33t++
Pippy P. Poopypants wrote:

I have yet to play Doom 3 on any pre-R300 card, but from what I hear it doesn't seem to suffer from the same graphical artifacts present on NV1x/NV2x cards (most noticeable on the carried weapon models).

Well there's a large loss of arithmetic precision on the DirectX 8-level cards so that's perhaps the cause of the artifacts.

Pippy P. Poopypants wrote:

And the built-in NV10 support is quite impressive too, considering that Carmack was knocking the GeForce4 MX cards when they were first released. But what goes around comes around 🤣.

I'm sure they only bothered because those GF4 MX cards sold incredibly well; they were in lots of OEM PCs. So in order to make more money, they supported them. Lots of game developers despised those cards because NVIDIA was selling GeForce2-class features under the GeForce4 name. ATI later did something similar with the Radeon 9000-9250, even though they had originally said the first digit of the model number indicated the DirectX level.

Reply 35 of 85, by Pippy P. Poopypants

User metadata
Rank: Member

Yeah, around the time HL2 was released, Valve did a survey, and the GeForce2 MX ended up being the most common card in gamers' machines (the GeForce4 MX was second, if I remember right). And of course, HL2 has a DX7 renderer as well. Regardless, just looking at the model number is no easy affair nowadays. You always gotta do plenty of research before buying.


Reply 36 of 85, by Tetrium

User metadata
Rank: l33t++

We need more pics here!

DSC00768.jpg

DSC00769.jpg

I'm a bit tired right now (close to 3 AM, and I already had one short night) and I forgot how to use thumbnails, but since it's only 2 pics I guess this will be fine this time.
The first pic is of my pile of my best AGP graphics cards (nothing fancy here really).
First row, from top to bottom:
Radeon X800, bought second-hand and untested at this time (it does POST though).
Radeon 9600XT, working.
I've had this one for over 3 years, methinks. It was the second card whose cooler I modded with a Zalman copper flower-like cooler (the 1st was one of those simple ones using 2 pushpins, as practice).
I got this one for free.
Radeon 9700? Untested, but supposedly working.

Second row:
Radeon 9600. Was used for years in either my main rig or a friend's rig (I've got 2 that are virtually identical and forgot which is which). It's in working condition.
Radeon 9600, I think. Can't remember how I got it 😵
Radeon 9800 XL (not a typo). Got it from someone who couldn't get it to work in his rig (it was a Pentium 2 😜). I removed the original tiny cooler for future replacement with some kind of large Zalman cooler.

The second pic shows 3 of the 4 cards I got for virtually free. All of them are artifacting, but I only wanted that Zalman cooler anyway 😜

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 38 of 85, by RichB93

User metadata
Rank: Member

Had a 9250 which my friend got with a prebuilt he was ripped off with (it was bought in 2007 for £180 and had a 1.2 GHz Duron on an old AGP 4X board!!)

Only other Radeon to talk about is the 5650 in my laptop. A little off-topic, but I have a Rage Pro Turbo 4MB, and I do love my Rage Fury MAXX, which provides the beef for my P3-800 rig ;D

Reply 39 of 85, by prophase_j

User metadata
Rank: Member

Here are some pictures of my 9800 XT, which has a broken capacitor held in place with a binder clip. Also installed is a Zalman cooling system.
imag0036cg.th.jpg

I used this card for a while in my KT133A system, and later in my 775i65G rig. In the KT133A with an Athlon XP @ 1.67 GHz, I could play Far Cry on medium settings, which I thought was pretty impressive. I can't find any of my old screenshots for it, but I want to say it gave me 13k in 3DMark01. It was definitely CPU bound in this configuration, but it served me well for over a year, with the computer running 24/7 for my daily use and folding proteins.

It only saw brief use in the 775i65G rig, which at the time was being driven by an E2160, and unfortunately I don't think I ever benchmarked it before getting a different card. I do remember Far Cry working on high settings though. I knew that even the humble E2160 had unleashed power not found in any but the most exotic NetBurst systems, but it was quickly overshadowed by my search for a faster card, which eventually wound up with an HSI Radeon 3850 for my AGP-slotted hotrod.

I did, however, put it back in the motherboard to test it out and get a proper reading of its potential. By this time I had also moved on from the E2160 to an E5800 teamed up with some DDR500, overclocked to 3.74 GHz. This setup scored 26k in 3DMark01 with the 9800. The same system gets 37k with the 3850, but at the stock 3.2 GHz processor speed. Now I know 3DMark01 is a DX 8.1 benchmark, but I still think that's pretty impressive next to a card three generations newer.

200137410009800.th.png


"Retro Rocket"
Athlon XP-M 2200+ // Epox 8KTA3
Radeon 9800xt // Voodoo2 SLI
Diamond MX300 // SB AWE64 Gold