VOGONS


Geforce FX Thread


Reply 240 of 259, by ProfessorProfessorson

Rank: Member

Actually, Oblivion crushed any DX9.0b card and below (except maybe the X800 XL and the higher-end X800s), even at 1024x768, and that was when paired with an FX-60, so it's not likely you got enjoyable fps of 30 or above at medium or high detail in that game with a Radeon 9700 or 9800. It was even giving the 6800 line a run for its money. This subject was covered quite extensively by GameSpot and FiringSquad back in '06. There were a few guides out there for tweaking the game, but none with enough trickery to make it run well on medium on a 9700/9800, because it simply ain't gonna happen. The game wasn't optimized well anyway.

Reply 241 of 259, by swaaye

Rank: l33t++

It does depend on your frame rate tolerance. I find 20-25 fps adequate for Oblivion. A 9700/9800 can deliver on medium at 1024x768. Although it is also necessary to have a fairly powerful CPU, such as a mid/top Athlon 64, and a box with a 9800 probably lacks this.

Oblivion is a game I like to experiment with. I've run almost every card with it. Even some DX8 cards via Oldblivion.

GF6 cards run considerably faster if you go with bloom instead of HDR. This is also how X850 and older run the game. A 6600GT gets a nice boost.

Reply 242 of 259, by [GPUT]Carsten

Rank: Newbie
[GPUT]Carsten wrote:

Doesn't change the fact, but part of it could be attributed to the 6600GT's measly 128 MB framebuffer.

swaaye wrote:

I'm not sure that would be a factor with Jedi Academy, but who knows. I think that game targets 32-64MB VRAM.
The nosedive the Radeon 9700 takes is another indicator - compared to the 9800 XT, which is clocked only about a third higher, it should not be just a third as fast. 😀
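A quick sanity check of those numbers (assuming, generously, that performance scales at best linearly with clock speed) shows why that gap points at memory rather than raw GPU speed:

expected from clocks alone: 9700 ≈ 1/1.33 ≈ 0.75x the 9800 XT
observed here: 9700 ≈ 0.33x the 9800 XT

The missing factor of two-plus has to come from somewhere other than clocks - running out of the 128 MB framebuffer is the obvious suspect.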

What helps 6600 GT compared to 9700 Pro here is PCI Express, which allows for faster texture uploading and bi-directional transfers. I am a bit sceptical here, because apart from heavy use of shader model 2.x, the FX line also took a comparatively large performance hit from enabling anisotropic filtering.

Reply 243 of 259, by Putas

Rank: Oldbie

Wasn't Oblivion the first game with a DX9 requirement? In 2006 the FX line was at the end of its lifetime. I still think dismissing it as unusable is too harsh. Sure, there were effects that could make it crawl, but there was always the option to lower the shader model and pump up something else instead. Often a reasonable trade-off.

Reply 244 of 259, by sliderider

Rank: l33t++
Putas wrote:

Wasn't Oblivion the first game with a DX9 requirement? In 2006 the FX line was at the end of its lifetime. I still think dismissing it as unusable is too harsh. Sure, there were effects that could make it crawl, but there was always the option to lower the shader model and pump up something else instead. Often a reasonable trade-off.

As I understand it, the Very Low settings weren't available from the beginning. They weren't added until the 1.1.511 patch, which was released a month after the first version of Oldblivion. My guess is Bethesda saw how many people were using that to play on unsupported hardware and realized they had underestimated the number of people still using DX8 and early DX9 hardware when they set their minimum requirements, so they patched the game to run at an acceptable framerate on that hardware to increase sales.

Reply 245 of 259, by F2bnp

Rank: l33t

Yep, that's exactly what happened. I used to own a Celeron 2.4 and a GeForce 5600XT at the time Oblivion came out and it was completely unplayable on my PC at any settings. Oldblivion really helped out the first month or so, then Bethesda released the patch and Very Low worked just fine.

Reply 246 of 259, by swaaye

Rank: l33t++

Oldblivion looks a lot better than Ultra Low. The rewritten shader programs worked out pretty well.

Check it out. Ultra low comparison included.
http://www.oldblivion.com/?page=screenshots
http://www.oldblivion.com/sm/index.php?board=8.0

In retrospect, Bethesda probably should have supported DX8 for the FX series, slow IGPs and the RPG fans who don't keep up on hardware.

Reply 247 of 259, by F2bnp

Rank: l33t

Heh, it's been some time.
I had no idea Oldblivion looked that much better!

Reply 248 of 259, by swaaye

Rank: l33t++

New acquisition. Literally brand new. eVGA GeForce FX 5200 Ultra 128MB.


Temps were high and the heatsink barely warm, so I disassembled it and found that the thermal paste had separated into worthlessness. An opportunity for a GPU photo op.

Performance of this monster is like you'd expect. It's similar to a GF3 Ti or Radeon 8500, except that it will "run" D3D9 games. Doom3 is essentially playable at 640x480 medium. 😉

The fan is inaudible in 2D, but start up a 3D app and wow it gets loud. Whiny hair dryer loud. It also doesn't seem to return to 2D quiet speed.

Reply 249 of 259, by jmrydholm

Rank: Member

I had an FX 5200 running in my Windows XP tower for a while. It was my fall-back option when my NVIDIA 7800 GS died. I recently found my good old FX 5900 XT, the card my dad got me back when I was in my early 20s. The OEM fan on it broke one day, so I sandwiched it in a Thermaltake "GIANT" heatpipe monster sink thing. It hums loudly, even on the lowest setting, but it's nice and cool. I use it along with my Voodoo 2 cards for old games that Windows 7 and newer DirectX drivers don't like.

I turned the turbo fan on one night when testing it and my wife said, "What the heck is that noise??" 🤣 fun stuff. I'm currently playing Thief Gold on it.

Edit: not my actual card, but it looks very similar to this now. The little fan on the edge is the noisy turbo fan.
[attached photo of a similar card]

"The height of strategy, is to attack your opponent’s strategy” -Sun Tzu
“Make your fighting stance, your everyday stance and make your everyday stance, your fighting stance.” - Musashi
SET BLASTER = A220 I5 D1 T3 P330 E620 OMG WTF BBQ

Reply 250 of 259, by Jolaes76

Rank: Oldbie

I am not sure whether this has been mentioned in this thread or anywhere else, but here is my experience with these cards (non-Ultras):

Most of the FX cards draw a little less power overall than the GF4 Ti line. In 2D mode it is obvious (different clocks), but I remember measuring less power consumption even under full load with the FX 5500/FX 5600 than with a GF4 Ti4200.

That small difference was just enough to keep my older systems stable where the AGP 1.0 slot does not provide enough power for a GF4. Alternatively, I re-flashed the BIOS of a GF4 Ti4200 with lower clocks (200/400) to stabilize it on my GA-5AX motherboard, and the performance was nearly the same.
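For anyone wanting to repeat that trick, the usual route back then was a BIOS editor such as NiBiTor plus nvflash from a DOS boot disk. A rough sketch only - the exact switches differ between nvflash versions and older GeForce4 BIOSes may need a different editor, so check the tool's built-in help before flashing anything:

rem back up the card's current BIOS first (the backup switch is -b or --save depending on the nvflash version)
nvflash -b original.rom
rem open original.rom in the BIOS editor, lower the core/memory clocks (e.g. 200/400), save as modded.rom
rem then write the modified image back to the card
nvflash modded.rom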

Also because mainstream FXs are shorter, lighter cards, they can be fitted more easily into heavily populated mobos. So even if DX9 features are useless, these cards are pretty good for a time machine.

"Ita in vita ut in lusu alae pessima iactura arte corrigenda est."

Reply 251 of 259, by swaaye

Rank: l33t++

http://techreport.com/articles.x/5257/10

Fascinating. The 5200/NV34 behaves here much like NV17/GF4MX. I'm wondering if the 5200 is more different from the 5600/5800 than the advertising suggests.

Reply 252 of 259, by Putas

Rank: Oldbie

IIRC the 5200 did not have (or did not use) the bandwidth-saving features of the higher models, like color compression.

Reply 253 of 259, by swaaye

Rank: l33t++

Yeah it also has about half of the transistors of NV31/5600. With that considered, it's harder to complain about its speed and features. Maybe it is a hybrid NV17 design.

Reply 254 of 259, by ProfessorProfessorson

Rank: Member

You know, right before I picked up a Radeon 9600XT I had gone with a PNY FX 5600 from Best Buy or Circuit City, I don't remember which. I wanted to match it with an Athlon XP 1700+. I don't remember if it was an Ultra card or not, but the card was supposed to be 256MB, yet at POST and in Windows it was reported as a 128MB card. I spent two days trying to resolve the matter with PNY customer support on the phone. Most of the support people were from a call center in India, and I honestly couldn't understand a thing they were saying half the time, and their reps' mannerisms over the phone gave me the sense they were talking down to me.

At any rate, it was becoming a huge runaround where they would say, oh, try this driver, oh, try this one, which was obviously not going to fix the problem since the card was reporting 128MB at POST. I had loved PNY prior to this due to their solid RAM and GF3 card line and all, but this really burned me. Not once on the phone was I offered the chance to return the card to them as an RMA, as I had requested. I just kept getting the runaround. I ended up taking the card back to the store and put the cash toward the 9600XT. That ended up being a faster card anyway. If any past or current employees of PNY ever read this post, you clowns can suck it.

Reply 255 of 259, by swaaye

Rank: l33t++

🤣

The 5600 is of course useless for most DX9 games, but these FX cards make pretty nice D3D 3-8 cards, and the OpenGL support is excellent. Doom 3 runs decently too, because the GeForce FX and that game engine were almost designed for each other.

The 5600 had a lot of variants. The Ultra cards usually have 128MB RAM, meaning the 256MB cards were typically budget cards with reduced clocks all around. Also, a few months in, NV released a revised NV31 chip in flip-chip packaging and added 50 MHz to the Ultra model.

Reply 256 of 259, by informatyk

Rank: Newbie

Is it worth upgrading from a 5600XT 128MB to a 5700VE 256MB? Would I notice any difference?

https://download.marpio.net IE4 compatible retro software download site

Reply 257 of 259, by ElectroSoldier

Rank: Oldbie

FX5200 with Windows 95b.
Great card, plays all the games I want it to without any issues, and all for less than a tenner.

Reply 258 of 259, by RandomStranger

Rank: Oldbie
informatyk wrote on 2023-07-21, 17:06:

Is it worth upgrading from a 5600XT 128MB to a 5700VE 256MB? Would I notice any difference?

The FX5600 XT is so severely mutilated, it's actually marginally slower than a reference 128bit FX5200.

The FX5700VE is also severely mutilated and only marginally faster than the reference 128bit FX5200 (unless you get one of the lower-clocked VEs).

So no, I wouldn't recommend the upgrade. Both are a joke. Just OC the 5600XT until you find something decent, like the plain FX5600 or FX5700.
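For what it's worth, on the driver versions of that era the clock sliders are hidden behind the old Coolbits registry tweak. A sketch of the usual .reg file follows; value 3 is the commonly quoted one, and a tool like RivaTuner works as an alternative:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"Coolbits"=dword:00000003

After importing that and rebooting, a clock-frequency page should appear in the NVIDIA control panel where the core and memory clocks can be raised in small steps.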


Reply 259 of 259, by mmx_91

Rank: Member
RandomStranger wrote on 2023-07-21, 18:03:
informatyk wrote on 2023-07-21, 17:06:

Is it worth upgrading from a 5600XT 128MB to a 5700VE 256MB? Would I notice any difference?

The FX5600 XT is so severely mutilated, it's actually marginally slower than a reference 128bit FX5200.

The FX5700VE is also severely mutilated and only marginally faster than the reference 128bit FX5200 (unless you get one of the lower-clocked VEs).

So no, I wouldn't recommend the upgrade. Both are a joke. Just OC the 5600XT until you find something decent, like the plain FX5600 or FX5700.

I bought an FX5600XT back in the day that overclocked very well, right up to stock 5600 speeds. The improvement is substantial, and most of them can do this with no risk and good temperatures.
It still works, now powering my Tualatin retro build!

However, as you said, it's not a reference card for the kind of use we put these machines to today xD