VOGONS


GeForce FX Thread


Reply 220 of 259, by sliderider

swaaye wrote:

Yup I went from an 8500 to a 9700.

I knew some guys who upgraded from a Ti 4400 to the lousy 5600 Ultra only to find their speed the same or worse. Blind NVIDIA faith.

I think the whole FX range was commercially successful though. A lot of people wouldn't even consider ATI back then. People got burned by ATI's horrible drivers on older cards, including the 8500, and this became very negative word of mouth.

And some people saw ATI losing in GL for various reasons. The FX cards did deliver in a lot of the GL games back then. ATI was trouble, particularly with BioWare games.

I had no trouble with any of the Baldur's Gates, Icewind Dale, or Planescape: Torment with my 9500.

Reply 221 of 259, by swaaye


Those aren't 3D games. The Bioware games that I'm referring to are KOTOR and NWN. I should have been specific. For that matter you could look up Doom3 because 9700/9800 didn't do very well there, and people used to swear by NV for Counterstrike. All OpenGL games.

OpenGL is to ATI of 1997-2004 what pixel shader 2.0 is to the FX series. 😉 With KOTOR the 8500-9250 can't even run the same effects as a GeForce 3! I just discovered that the other day. No soft shadows or frame buffer effects. And the poor Radeon / Radeon 7500 don't run the game correctly whereas GeForce 1/2 does.

Reply 223 of 259, by sliderider

swaaye wrote:

Those aren't 3D games. The Bioware games that I'm referring to are KOTOR and NWN. I should have been specific. For that matter you could look up Doom3 because 9700/9800 didn't do very well there, and people used to swear by NV for Counterstrike. All OpenGL games.

OpenGL is to ATI of 1997-2004 what pixel shader 2.0 is to the FX series. 😉 With KOTOR the 8500-9250 can't even run the same effects as a GeForce 3! I just discovered that the other day. No soft shadows or frame buffer effects. And the poor Radeon / Radeon 7500 don't run the game correctly whereas GeForce 1/2 does.

Here's something about Doom3:

http://www.hardwareanalysis.com/content/topic/30683/?o=20

Apparently ATI had a setting in the Catalyst control panel that you had to turn off. Later cards supported by that version of the control panel had a temperature sensor, and if the setting was used with a card that didn't have one, it caused problems. It was on by default, so you had to go in and turn it off manually before playing Doom3.

Reply 224 of 259, by swaaye

Well, here's what I do know about Doom3 and ATI:

-NV35 and later have a double Z fillrate mode that is perfect for Doom3. PR called it "UltraShadow".
-ATI's occlusion culling mechanisms didn't entirely work = big loss of efficiency.
-NV's OpenGL driver was simply better quality back then.
-Doom3 is light on complex pixel shaders = NV3x didn't implode.

Analysis of why X800 couldn't compete with 6800 in Doom3. Relates to 9500-9800 too.
http://alt.3dcenter.org/artikel/2004/07-30_english.php

But I'm convinced that with NWN and KOTOR the problems were caused by negligence from both BioWare and ATI. ATI didn't care to put in the effort to make their cards work really well with the games, and BioWare didn't care to design their games for ATI's cards. These games seem to be pretty much targeting NV's OpenGL features.

ATI was ok with Quake 3 powered games though.

Reply 225 of 259, by swaaye

Get this. I decided to try KOTOR on my Radeon 9600, and ATI has actually broken the game in their final driver pack (Cat 10.2). It loads, but the image is all messed up. Doom3 worked, so it wasn't a general OpenGL problem.

I remember Catalyst 4.2 being a preferred version for KOTOR. A guy did some testing way back and that one worked the best. And sure enough, the game worked again with this revision. I was hoping that after 6 more years they would have cleared up the issues. I guess not! 😀

On a side note, Radeon 9600 has substantially better texture filtering than Radeon 8500.

Reply 226 of 259, by sgt76

KOTOR is a system hog! And it really needs a Radeon 9800 or better to play. I tried it once on my Ti 4600 and it looked and played like arse, which, considering the age of the game and its graphics (not that great), is quite something. Same with Halo: not very impressive graphics, yet it requires an idiotic amount of horsepower compared to its contemporaries.

Reply 227 of 259, by leileilol

swaaye wrote:

ATI was ok with Quake 3 powered games though.

Except for the ones that use skyboxes. Nvidia forces a pixel clamp, where ATI actually conforms to the proper clamp mode (producing artifacts). So in this case an Nvidia oversight was something id Software relied on as the de facto implementation.

In other words, ATI gives black lines along skybox edges.

Also, I should mention that Radeons perform best with r_primitives 2, but for some reason the game automatically chooses 1 instead (0 is auto, and it'll pick mode 1).
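
For anyone who wants to try it, a minimal sketch (my example, assuming a stock Quake 3 install): drop this into baseq3\autoexec.cfg, or enter it in the console.

    // Force the vertex submission path that reportedly runs best on Radeons.
    // (r_primitives 0 is supposed to auto-detect, but it ends up on mode 1.)
    seta r_primitives "2"

Using seta just makes the setting stick in your config between sessions.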

Reply 228 of 259, by sliderider

sgt76 wrote:

KOTOR is a system hog! And it really needs a Radeon 9800 or better to play. I tried it once on my Ti 4600 and it looked and played like arse, which, considering the age of the game and its graphics (not that great), is quite something. Same with Halo: not very impressive graphics, yet it requires an idiotic amount of horsepower compared to its contemporaries.

That's strange, since it runs fine on an Xbox and the graphics hardware in that is similar to a GeForce4 Ti. A GeForce4 Ti is actually clocked faster, has more memory bandwidth, and has a greater fill rate. Are you sure you weren't using it with a slow CPU?

Reply 229 of 259, by swaaye

sgt76 wrote:

KOTOR is a system hog! And it really needs a Radeon 9800 or better to play. I tried it once on my Ti 4600 and it looked and played like arse, which, considering the age of the game and its graphics (not that great), is quite something. Same with Halo: not very impressive graphics, yet it requires an idiotic amount of horsepower compared to its contemporaries.

It's hard to judge, because the two games use a good deal of shader effects and bump mapping. The pixels in these games have more work being done to them than in other games from around then. This also means that when you increase the resolution, there is a bigger performance hit than we were used to.

sliderider wrote:

That's strange, since it runs fine on an Xbox and the graphics hardware in that is similar to a GeForce4 Ti. A GeForce4 Ti is actually clocked faster, has more memory bandwidth, and has a greater fill rate. Are you sure you weren't using it with a slow CPU?

Halo runs pretty well, but KOTOR has some chop to it. The Xbox's problem is the Celeron and the shared memory interface, which definitely has an impact on the nice GPU in there.

Reply 230 of 259, by swaaye

I've been playing with my 5900 Ultra and 6600GT again. I am continually impressed by how much faster the 6600GT is. Even in games without a lot of DX9 effects, such as KOTOR 2.

It's fascinating to me how unbalanced 5900U is. Compared to 6600GT, it has 2x the memory bandwidth and it's a 256MB card. It's obvious that 6600GT is much more efficient in every way and that the 5900U never had a chance of utilizing the memory bandwidth it was given. Even in games without the shader effects that choke it up, it lacks sufficient fillrate to use that bandwidth.
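
To put rough numbers on that, here's a quick back-of-the-envelope Python sketch. The spec figures are my assumptions (256-bit / 425 MHz DDR for the 5900U, 128-bit / 450 MHz DDR for the AGP 6600GT), so correct them if they're off:

    # Theoretical peak bandwidth = bus width in bytes * effective (DDR) transfer rate.
    def peak_bandwidth_gbs(bus_bits, mem_clock_mhz):
        return (bus_bits / 8) * (mem_clock_mhz * 2) * 1e6 / 1e9

    print(peak_bandwidth_gbs(256, 425))  # 5900U:        ~27.2 GB/s
    print(peak_bandwidth_gbs(128, 450))  # 6600GT (AGP): ~14.4 GB/s

That works out to roughly the 2x figure, and it's bandwidth the 5900U mostly can't turn into frames.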

Reply 231 of 259, by Tetrium

swaaye wrote:

I've been playing with my 5900 Ultra and 6600GT again. I am continually impressed by how much faster the 6600GT is. Even in games without a lot of DX9 effects, such as KOTOR 2.

It's fascinating to me how unbalanced 5900U is. Compared to 6600GT, it has 2x the memory bandwidth and it's a 256MB card. It's obvious that 6600GT is much more efficient in every way and that the 5900U never had a chance of utilizing the memory bandwidth it was given. Even in games without the shader effects that choke it up, it lacks sufficient fillrate to use that bandwidth.

I came to the same conclusion. The 5900U is quite a crappy card compared to its immediate successor.
A world of difference

Reply 232 of 259, by Putas

Well, those were the times of doubling performance between generations. While the 5900 Ultra may have too much bandwidth, you can't prove it by comparison with newer designs that utilize bandwidth with ever-increasing efficiency.

Reply 233 of 259, by sliderider

swaaye wrote:

I've been playing with my 5900 Ultra and 6600GT again. I am continually impressed by how much faster the 6600GT is. Even in games without a lot of DX9 effects, such as KOTOR 2.

It's fascinating to me how unbalanced 5900U is. Compared to 6600GT, it has 2x the memory bandwidth and it's a 256MB card. It's obvious that 6600GT is much more efficient in every way and that the 5900U never had a chance of utilizing the memory bandwidth it was given. Even in games without the shader effects that choke it up, it lacks sufficient fillrate to use that bandwidth.

I think it's time to retire the 5900U and get a 6800U. 😜

Reply 234 of 259, by sgt76

sliderider wrote:
sgt76 wrote:

KOTOR is a system hog! And it really needs a Radeon 9800 or better to play. I tried it once on my Ti 4600 and it looked and played like arse, which, considering the age of the game and its graphics (not that great), is quite something. Same with Halo: not very impressive graphics, yet it requires an idiotic amount of horsepower compared to its contemporaries.

That's strange, since it runs fine on an Xbox and the graphics hardware in that is similar to a GeForce4 Ti. A GeForce4 Ti is actually clocked faster, has more memory bandwidth, and has a greater fill rate. Are you sure you weren't using it with a slow CPU?

2.8 GHz Northwood, so not a CPU issue. Maybe I just expect too much nowadays. My personal minimum for playing a game is 1024x768, so it may just be a problem with expectations.

Reply 235 of 259, by swaaye

Putas wrote:

Well, those were the times of doubling performance between generations. While the 5900 Ultra may have too much bandwidth, you can't prove it by comparison with newer designs that utilize bandwidth with ever-increasing efficiency.

True. I could always underclock the RAM and see what happens to performance.

sliderider wrote:

I think it's time to retire the 5900U and get a 6800U. 😜

I blew up a 6800GT last spring. I was messing with an 865PE board and I stupidly let it fall out of the AGP slot while the system was powered. All the magic smoke came out, and I think a little fairy was incinerated too in the brief light show. 🙁 So getting another one of those hasn't been a priority. Been there, exploded one.

Reply 236 of 259, by swaaye

I did a quick comparison of my 5900U with stock 425MHz RAM vs. 225MHz RAM.

Core 2 Duo 2.50 GHz, 775i65G, 2GB PC3200 DC, XP, GeForce FX 5900 Ultra on 93.71

UT2003
max details, 1600x1200x32

botmatch-antalus
425 MHz (stock):
23.248985 / 78.073921 / 134.904953 fps -- Score = 78.082634 rand[20587]
23.762781 / 78.064201 / 136.013123 fps -- Score = 78.149239 rand[20587]
225 MHz:
16.129736 / 53.293602 / 116.512314 fps -- Score = 53.347004 rand[20587]
16.149319 / 53.268341 / 121.811752 fps -- Score = 53.318691 rand[20587]
-31.8% change

flyby-asbestos
425 MHz (stock):
40.571835 / 158.161346 / 771.704956 fps -- Score = 158.187927 rand[19376]
41.495853 / 158.136520 / 760.919861 fps -- Score = 158.160156 rand[19376]
225 MHz:
27.961529 / 105.967033 / 651.829651 fps -- Score = 106.084663 rand[19376]
27.905579 / 105.804581 / 649.546875 fps -- Score = 105.922668 rand[19376]
-33.0% change


DOOM3
HQ 1024x768 demo1
(triple runs)
425 MHz (stock): 41.9 fps
225 MHz: 31.0 fps

-26.0% change

So with a loss of 47% memory bandwidth, the card appears to lose about 30% of its performance in these situations.
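
For reference, the percentages are just the averaged underclocked score against the averaged stock score; here's a quick Python check of the numbers above:

    def pct_change(stock, underclocked):
        return (underclocked - stock) / stock * 100

    print(pct_change(78.1, 53.3))    # UT2003 botmatch: ~-31.8%
    print(pct_change(158.2, 106.0))  # UT2003 flyby:    ~-33.0%
    print(pct_change(41.9, 31.0))    # Doom3 demo1:     ~-26.0%
    print(pct_change(425, 225))      # RAM clock:       ~-47.1%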

Secondly, in cases where 4X AA and 8X AF are enabled, the 5950U can pull ahead of the 6600GT.
http://www.anandtech.com/show/1464/8

Reply 237 of 259, by [GPUT]Carsten

Doesn't change the fact but part of it could be attributed to 6600GT's measly 128 MB framebuffer.

Reply 238 of 259, by ProfessorProfessorson

It matters if you plan to use eye candy in games that utilize the 5900 line well. In the end though, either way, the 5900 and 6600 cards are both basically limited after a certain point, and neither will get you great performance in top-tier titles released after Doom 3 or HL2. Considering the 5950 was released around September or October of 2003, it has some excuse at least, and it performs well overall for its generation (DX7 through DX9.0a).

As long as you don't expect the card to start smoking titles from 2005 on up, it's a fine performer for a legacy card. If you're anal about Half-Life 2 performance, just use an R9700 or 9800 instead, or better (not like that game doesn't run on modern Win 7 systems anyway).

Reply 239 of 259, by swaaye

It's the Shader Model 2 games that absolutely murder the FX series, of course. In such games you can see a 9700/9800 be several times faster. So somebody with a 9800 Pro could enjoy Oblivion, whereas the 5900U was mostly done being useful long before that. The 5900U, even with its extra FP32 ALUs, was effectively still a glorified DX8.1 card IMO.

[GPUT]Carsten wrote:

Doesn't change the fact but part of it could be attributed to 6600GT's measly 128 MB framebuffer.

I'm not sure that would be a factor with Jedi Academy, but who knows. I think that game targets 32-64MB of VRAM.