VOGONS


Geforce FX Thread


Reply 40 of 259, by Mau1wurf1977

User metadata
Rank l33t++

Well, the statement I made is very subjective and depends on resolution, AA settings, HDR, and what frame rates you find acceptable.

There were parts of the game where even a much later Core 2 Duo with a 7900GT would start to sweat, and the same goes for Doom 3.

Doom 3 for me was unplayable on a Radeon 9800. I did play Farcry, but had to run it at XGA (1024 x 768 instead of 1600 x 1200) with most details set to medium. No AA or HDR.

Not sure when the game was released, but when I played it, a work colleague had just bought a Geforce 6800GT for his Athlon FX rig.

My brother played Farcry on a Ti4200 in DX8 at 800 x 600 and he would also say that the game ran perfectly. It all comes down to the settings, so my bad for not being more specific.

Anyway, tomshardware has all the old reviews on file, if anyone wants to BING them 🤣

Reply 41 of 259, by swaaye

User metadata
Rank l33t++

You thought Doom3 was unplayable on a 9800? Man I played through that game on a Radeon 9600 and a 9700 Pro. 😀 A friend of mine played it on a GF4!

I didn't play it at 1600x1200 though. More like 1280x800 or 1024x768.

This is my fav Doom3 video card comparison
http://techreport.com/articles.x/7200/1

Reply 42 of 259, by Mau1wurf1977

User metadata
Rank l33t++

I had a P4 Northwood 2.6, 2 GB DDR, a Radeon 9800 and a 40 kg 21" Compaq CRT.

All the DX8 games FLEW on my machine: Medal of Honour, Call of Duty, Splinter Cell, RTCW. 1600 x 1200, everything maxed, 60+ fps.

And then Farcry and Doom 3 came out and it was a lag fest...

I played both games much much later on my Core 2 Duo with a 7900GT.

I still do a similar thing: buy cheap games in Steam sales for under 10 bucks and enjoy smooth gameplay. It also means the patches and driver updates are already out.

PS: Doom 3 on Ultra texture quality uses 512 MB of VRAM. My 9800 had only 128 MB I believe, and only the 9800XT DDR2 came with 256 MB (if my memory serves me right).

Reply 43 of 259, by leileilol

User metadata
Rank l33t++
Mau1wurf1977 wrote:

All the DX8 games FLEW on my machine: Medal of Honour, Call of Duty, Splinter Cell, RTCW. 1600 x 1200, everything maxed, 60+ fps.

Only Splinter Cell on that list is a DX8 game (the rest are standard, DX7-ish level OpenGL titles relying heavily on HW T&L and compressed textures, and they also fly on a GeForce256), and that one was hard to get running at 60 fps anyway. It was a benchmarking game: nobody got more than 25 fps at that resolution at the time, and it took a couple of Moore's-law doublings to reach that point 😉
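Quick sketch of the math behind that "couple of Moore's" (rough numbers, assuming the classic 18-24 months per doubling):

import math

# Going from ~25 fps to 60 fps needs a ~2.4x speedup.
doublings = math.log2(60 / 25)   # ~1.26 doublings of raw throughput
years_low = doublings * 1.5      # ~1.9 years at 18 months per doubling
years_high = doublings * 2.0     # ~2.5 years at 24 months per doubling
print(round(doublings, 2), round(years_low, 1), round(years_high, 1))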

long live PCem

Reply 44 of 259, by swaaye

User metadata
Rank l33t++
Mau1wurf1977 wrote:

PS: Doom 3 on Ultra texture quality uses 512 MB of VRAM. My 9800 had only 128 MB I believe, and only the 9800XT DDR2 came with 256 MB (if my memory serves me right).

Ultra is kinda lame though, because the biggest change is that it turns off texture compression. Maybe a few compression artifacts are gone, but it's really hard to notice.
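To put rough numbers on it (just a sketch; the exact figures depend on the texture set, but DXT1 is 4 bits per pixel and DXT5 is 8, versus 32 for uncompressed RGBA):

def texture_mb(width, height, bits_per_pixel, mip_overhead=1.33):
    """Approximate VRAM for one texture, including ~33% for mipmaps."""
    return width * height * bits_per_pixel / 8 / (1024 * 1024) * mip_overhead

# One 1024x1024 texture:
uncompressed_rgba = texture_mb(1024, 1024, 32)  # ~5.3 MB (what Ultra keeps resident)
dxt5 = texture_mb(1024, 1024, 8)                # ~1.3 MB
dxt1 = texture_mb(1024, 1024, 4)                # ~0.7 MB

# Multiply by the hundreds of diffuse/normal/specular maps a level streams in and
# uncompressed textures blow well past 128/256 MB cards, hence the 512 MB figure for Ultra.
print(round(uncompressed_rgba, 1), round(dxt5, 1), round(dxt1, 1))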

Reply 45 of 259, by Mau1wurf1977

User metadata
Rank l33t++
leileilol wrote:

and that one was hard to get running at 60 fps anyway.

Which one?

I swear I played these at 1600 x 1200 with super smooth frames.

swaaye wrote:

Ultra is kinda lame though, because the biggest change is that it turns off texture compression. Maybe a few compression artifacts are gone, but it's really hard to notice.

Lame or not, the option is there and I want to max it out. There are areas in the game where, when you fire the plasma gun, you get lag even on a Core 2 Duo with a 7900GT!

If I didn't care for these settings I would just buy an Xbox360 🤣

Reply 46 of 259, by swaaye

User metadata
Rank l33t++

What's cool is the plasma gun mod that makes the pulses illuminate their surroundings. 😁 That is a real eye-opener and gets the adrenaline going 🤣

Reply 47 of 259, by swaaye

User metadata
Rank l33t++

leileilol, do you believe that Doom3's shader programs were written primarily with NV30's architecture in mind?

Last edited by swaaye on 2011-02-12, 02:17. Edited 2 times in total.

Reply 49 of 259, by leileilol

User metadata
Rank l33t++
swaaye wrote:

leileilol, do you believe that Doom3's shader programs were written primarily with NV30's architecture in mind?

No, because it looks fine on NV15 as well, sans minor 'heat' and glass refraction effects 😀

I'm not even sure Doom3 uses shader programs all that much. Quake IV definitely had more shader use than it did, though, and that was unplayable on the FX.

Mau1wurf1977 wrote:

Which one?

Splinter Cell's the slow one. It'd better be slow as it was rare for 2002-2003 games to even TOUCH pixel shader 1.x!

long live PCem

Reply 50 of 259, by Mau1wurf1977

User metadata
l33t++

Hmm that's weird. Maybe I ran it at a lower setting...

I played it again ~2 years ago and there are huge issues with the shadows. Apparently only video cards and drivers from that era can display them correctly. The same goes for Pandora.

There are levels where you don't see certain lights and shadows, making the game quite tricky.

Also, apparently the Xbox versions are the best ones, as the Xbox was the lead platform and the PC versions are ports.


Reply 51 of 259, by TheLazy1

User metadata
Rank Member

A year or so ago I bought an FX5200 for S-Video out to a TV; unfortunately, overlay video was horribly broken.
Every other scanline was distorted. It was fine in pure software modes, but on an underpowered PC those were a deal breaker.

Reply 52 of 259, by swaaye

User metadata
Rank l33t++

On that note, I was going to use a horrible FX 5200 64-bit Edition as a 2D DVI card for my parents' old Dell box. I got it all hooked up and then found that it couldn't handle 1680x1050 over DVI. I tried two different driver revisions, but it simply would not output that resolution on DVI. Web searches confirmed it.

I ended up using an older Radeon 7500.

Reply 53 of 259, by sliderider

User metadata
Rank l33t++
swaaye wrote:

Anyone here have a 5800?

Going by the specs alone, it looks like a GT430 is faster than an FX5800 Ultra. That's pretty sad. 😲

The GT120/220 OEM boards look like they are even pretty close to FX5800U performance on paper. 😮

(This post is just to give younger viewers a more modern point of reference to see where FX5800 performance sits)

Reply 54 of 259, by Iris030380

User metadata
Rank Member

Of course it is ... the cards are almost 10 years apart!!

I know the Radeon 9700 Pro and 9800 were fast, but the FX5950 Ultras were quicker in most DX8 and OpenGL games. And yeah, Doom3 was playable on a 9800 at 1024 - but it wasn't smooth. It probably lost about 6-10 fps to a 5950.

It's all relative. I played Farcry on my Ti4800SE and 5600 Ultra and was quite pleased with the game - but I never own top-end hardware and am happy to sacrifice graphics to get fps. I'm a gamer after all, not a techie who benchmarks all day like some of my friends.

I agree with what someone said earlier about the old days, when a next-gen card was actually worth upgrading to. The leaps were good for a few years back then, and the GF4 Ti series would see you right for 3 years if not more. Things changed after the GF7 series was released, and I lost interest.

Reply 55 of 259, by F2bnp

User metadata
Rank l33t

Of course they are; there is a gap of 7 years between those cards.

Reply 56 of 259, by swaaye

User metadata
Rank l33t++
sliderider wrote:

Going by the specs alone, it looks like a GT430 is faster than an FX5800 Ultra. That's pretty sad. 😲

A Radeon 9600 is faster in just about any DX9 game, and it also outputs better image quality, given all the nasty image quality hacks that NV pulled back then. Wasn't that just a little embarrassing for them? 😉

Last edited by swaaye on 2011-02-12, 18:44. Edited 1 time in total.

Reply 57 of 259, by swaaye

User metadata
Rank l33t++
Iris030380 wrote:

I know the Radeon 9700 Pro and 9800 were fast, but the FX5950 Ultras were quicker in most DX8 and OpenGL games. And yeah, Doom3 was playable on a 9800 at 1024 - but it wasn't smooth. It probably lost about 6-10 fps to a 5950.

Well, Doom3 was definitely smooth on a 9700/9800 depending on the resolution. I think I played it at 1024 or 1280. ATI improved performance over time as well, but the 5950 may well have always been faster than R3x0.
http://techreport.com/articles.x/7200/5

Doom3's rendering style breaks some of ATI's efficiency hardware. The NV cards were really well tailored for how Doom3 works. I think NV thought Doom3's tech was the future more than ATI did. But I don't think NV was right 🤣
http://alt.3dcenter.org/artikel/2004/07-30_english.php

The GeForce FX cards are such strange ones. You really have to wonder what they were thinking. They suck so much at DirectX 9 that it is truly a curiosity... What was even more amazing was how much NV40 differed and how it fixed every weakness.

Reply 58 of 259, by Mau1wurf1977

User metadata
Rank l33t++

This should bring back some memories:

VGA Charts IV: AGP Graphics Cards

http://www.tomshardware.com/reviews/vga-charts-iv,893.html

Back in the Doom 3 days I remember that an Athlon 64 and a 6600GT was an excellent combination for Doom 3. It was also quite affordable.

Looking at the Doom 3 benchmark at 1600 x 1200 with all the settings cranked, a 6800 Ultra is 4x as fast as a 9800 Pro, and that's also exactly how I remember the experience with my 9800. I started the game and uninstalled it again. I finished it years later on a 7900GT.

I believe for a long time Nvidia had a massive edge when it came to OpenGL games.

Farcry at 1600 x 1200 isn't smooth on any of these cards. The X800XT PE is the fastest with ~ 30 fps. And that's not even the version with the HDR patch, which IMO is the best version.

Once you play these games at 1920 x 1080 (they all support widescreen) and turn on all the settings, you need quite a decent card. For a modern XP / DX9 retro PC I would recommend an 8800/9800GT, or even a current GTX460.

Reply 59 of 259, by swaaye

User metadata
Rank l33t++
Mau1wurf1977 wrote:

I believe for a long time Nvidia had a massive edge when it came to OpenGL games.

It's not a massive edge though, more like a 10-20% edge, and the only modern OpenGL game engine of those times was Doom3's (id Tech 4 or whatever), so it's not a test of OpenGL support overall.

On the other hand, Far Cry and FEAR generally tended to run faster on a Radeon X800 than on a 6800 for some reason. X800 cards ran at higher clock speeds and so had a lot more fillrate, so maybe that's why.
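Rough fillrate math, assuming the commonly quoted reference clocks (520 MHz / 16 pixel pipes for the X800 XT PE, 400 MHz / 16 for the 6800 Ultra), so treat it as ballpark:

def peak_fillrate_gpixels(core_mhz, pixel_pipes):
    """Theoretical peak pixel fillrate in Gpixels/s (core clock x pipelines)."""
    return core_mhz * pixel_pipes / 1000.0

x800_xt_pe = peak_fillrate_gpixels(520, 16)     # ~8.3 Gpixels/s
gf_6800_ultra = peak_fillrate_gpixels(400, 16)  # ~6.4 Gpixels/s
print(round(x800_xt_pe / gf_6800_ultra, 2))     # ~1.3, i.e. roughly 30% more raw fillrate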

It sounds like you aren't happy unless you're getting 60 fps at 1600x1200 so you must have had a rough 3-4 years there. 😉