VOGONS


First post, by watson

Rank: Member

I want to know why Crysis doesn't run on GeForce FX cards, but runs on R300.
Wikipedia tells me R300 supports PS 2.0, and FX supports PS 2.0a, which supposedly has more features: https://en.wikipedia.org/wiki/High-Level_Shad … ader_comparison

However, somebody must be lying here because this is what Crysis reports on an FX 5200:


Unsupported video card!
- NVIDIA GeForce FX 5200 (vendor = 0x10de, device = 0x0322)
- Video memory: 123 MB
- Minimum SM 2.0 support: no
- Rating: 1

Obviously I'm not trying to play the game on this card; I just want to know the technical explanation, i.e. that R300 has some feature X which the FX series lacks.
We all know how bad these cards are with DX9 games, but it should at least start up.
Does it even truly support PS 2.0 then or did Crytek simply blacklist the FX series because it's so bad?

I'm sorry if this has been discussed in another topic.
There do seem to be several videos on YouTube, which I presume to be fake.
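Crysis's exact detection code isn't public, but a D3D9 game typically decides "Minimum SM 2.0 support" by reading D3DCAPS9::PixelShaderVersion from GetDeviceCaps. The SDK encodes that field with the D3DPS_VERSION macro, and a minimal sketch of such a check (function names here are illustrative, not Crytek's) looks like this:

```python
# D3D9 reports the pixel shader level in D3DCAPS9::PixelShaderVersion,
# encoded by the SDK's D3DPS_VERSION macro: 0xFFFF0000 | (major << 8) | minor.
def d3dps_version(major: int, minor: int) -> int:
    return 0xFFFF0000 | (major << 8) | minor

def supports_sm2(pixel_shader_version: int) -> bool:
    # Both R300 (ps_2_0) and NV3x should pass a plain comparison like this,
    # which is why a straight caps check can't explain the rejection alone.
    return pixel_shader_version >= d3dps_version(2, 0)

assert supports_sm2(d3dps_version(2, 0))      # R300-class: yes
assert not supports_sm2(d3dps_version(1, 3))  # GeForce4 Ti-class: no
```

If the FX drivers really do expose ps_2_0 in their caps, a check of this shape would say "yes", so the "Minimum SM 2.0 support: no" on the 5200 suggests either stripped-down caps on that particular chip or an extra vendor/device test on top.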

Reply 1 of 13, by SPBHM

Rank: Oldbie

Most FX 5200s had SM 2.0 support removed; the earlier ones, I think, still had it.

In any case, even a 5950 Ultra is going to be very slow in Crysis, probably massively slower than a 9500 NP.

SM 2.0 was basically specified to follow the R300 design very closely, and by then it was too late to adapt the NV3x to it.

Reply 2 of 13, by watson

Rank: Member

I just tried it with an FX 5600:


Unsupported video card!
- NVIDIA GeForce FX 5600 (vendor = 0x10de, device = 0x0312)
- Video memory: 118 MB
- Minimum SM 2.0 support: yes
- Rating: -1
User chose to continue!

Of course, it crashes to desktop.

FX 5700LE:


Unsupported video card!
- NVIDIA GeForce FX 5700LE (vendor = 0x10de, device = 0x0343)
- Video memory: 246 MB
- Minimum SM 2.0 support: yes
- Rating: -1
User chose to continue!

It also crashes to desktop after that.

I also tried an FX 5900XT. Interestingly, there is no "unsupported card" message and nothing in the log. It just tries to start and crashes.
I've tried both the demo version and GOG version.

It runs "like a dream" on a 6600GT on the same system:

Attachment: 6600GT.PNG (screenshot, 1.31 MiB)

Reply 5 of 13, by watson

Rank: Member
agent_x007 wrote:

The driver path for the FX series can't handle Crysis's shaders at any reasonable rate.

leileilol wrote:

GFFX's SM2.0 implementation is so slow and bad it's better off not happening at all. Crysis made the right call here.

So you think it was deliberately disabled by Crytek? Why does it let you "continue" after the unsupported hardware message then?
Would it be possible to analyze this with some kind of D3D interception tool (apitrace?) and try to disable the offending API calls?

I guess I'll try something tomorrow. It's been bothering me for too long.
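The interception idea boils down to lying to the game's capability check: a proxy d3d9.dll would wrap IDirect3D9::GetDeviceCaps and rewrite PixelShaderVersion before the game sees it. This is only a value-level sketch of that rewrite (the proxy itself, and the name `spoof_ps_version`, are hypothetical), and note that it wouldn't help here on its own, since the 5600 and 5700LE already report "Minimum SM 2.0 support: yes" and crash anyway:

```python
# Illustrative only: the rewrite a hypothetical d3d9 proxy would apply to
# D3DCAPS9::PixelShaderVersion before handing the caps back to the game.
# Encoding per the D3DPS_VERSION macro: 0xFFFF0000 | (major << 8) | minor.
PS_2_0 = 0xFFFF0000 | (2 << 8) | 0

def spoof_ps_version(reported: int) -> int:
    """Claim at least ps_2_0, whatever the driver actually exposed."""
    return max(reported, PS_2_0)

assert spoof_ps_version(0xFFFF0104) == PS_2_0      # ps_1_4 card now claims 2.0
assert spoof_ps_version(0xFFFF0300) == 0xFFFF0300  # ps_3_0 card left alone
```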

Reply 6 of 13, by swaaye

Rank: l33t++

Try Oblivion, or HL2 with -dxlevel 90 on the command line. Both are a fun time with an FX card. With the often noisy VRM on the FX 59xx cards, you can even hear the card thinking hard to make each and every frame. 🤣

Doom 3 on the other hand runs pretty well. FX cards will still use the full precision ARB 2.0 path. Seems like a best case for pretty graphics from FX cards.

Reply 7 of 13, by leileilol

Rank: l33t++

You know what's a little insultingly ironic about this? The GeForce FX's pixel shader 2 model is a higher version than the supported R300 cards'. It's only "better" on paper and on whatever algorithmic computer-hardware aggregation site misinforms you.

long live PCem

Reply 8 of 13, by The Serpent Rider

Rank: l33t++

To say the FX cards have SM2 support is an overstatement at best.

They do, but only Doom 3 and Quake 4 addressed the FX series' SM2 quirks more or less properly.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 10 of 13, by watson

Rank: Member
leileilol wrote:

You know what's a little insultingly ironic about this? The GeForce FX's pixel shader 2 model is a higher version than the supported R300 cards'.

Yes, that's exactly what's bothering me. It would mean Nvidia lied about proper DX 9.0 support (terrible performance aside).

I just wasted over two hours trying to get this thing to run.

Apitrace wasn't much help; there was a single line in the log, about a Direct3DCreate9 failure.
I tried 3D Analyze with countless combinations of settings, but nothing worked. It should support circumventing SM 2.0 checks, but it's from 2004, so I can't really blame it.
Funnily enough, SwiftShader actually works, but that's 100% software rendering, so it does nothing to fix our problem.
I also messed around with DxWnd (forcing software vertex shaders and some other things), but nothing.
Finally, I changed all references to crysis.exe inside the executable using HxD, in case it was blacklisted by Nvidia's drivers. You guessed it: didn't work.
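The HxD trick amounts to a same-length byte patch: driver app-detection often keys on the executable's name, so every embedded "crysis.exe" string gets swapped for a dummy of identical length. A minimal sketch (the function name and dummy name are illustrative):

```python
# Sketch of the exe-rename trick: replace embedded name strings with a
# same-length dummy so driver app-detection no longer matches. The length
# must match exactly, otherwise file offsets shift and the binary breaks.
def patch_exe_references(data: bytes, old: bytes, new: bytes) -> bytes:
    if len(old) != len(new):
        raise ValueError("replacement must be the same length as the original")
    return data.replace(old, new)

blob = b"...crysis.exe...CRYSIS.EXE..."
patched = patch_exe_references(blob, b"crysis.exe", b"crysiz.exe")
assert b"crysis.exe" not in patched
assert b"CRYSIS.EXE" in patched  # case-sensitive: other casings need their own pass
```

Renaming the file on disk to match the dummy would complete the trick; as the post above found, though, it made no difference here, so the crash apparently isn't a driver-side app profile.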

I guess I'm done with this wild goose chase for the time being. The only other thing one could try is installing various driver versions (I used the latest driver for FX).

I also tried out HL2 with DX9 on the FX 5600 (720p, lowest) and it's a slideshow. 5900XT would probably be somewhat playable at these settings, but any sane person would just stick with DX8.

Reply 11 of 13, by swaaye

Rank: l33t++

A few weeks ago, while I was looking around for info on NV2A video acceleration, Google found me a fun PDF dated September 2018 called "nVidia Hardware Documentation". There's some interesting info about all of the NVidia architectures.

https://media.readthedocs.org/pdf/envytools/l … t/envytools.pdf

NV34 / 5200 is a curious chip. On the surface it sounds very similar to NV31 / 5600. But it actually seems to be somewhat based on GF4 MX.

watson wrote:

I also tried out HL2 with DX9 on the FX 5600 (720p, lowest) and it's a slideshow. 5900XT would probably be somewhat playable at these settings, but any sane person would just stick with DX8.

https://www.youtube.com/watch?v=KwUxxLzu-uk 🤣

Reply 12 of 13, by SPBHM

Rank: Oldbie
swaaye wrote:

A few weeks ago Google found me a fun PDF dated September 2018 called "nVidia Hardware Documentation" while looking around for info on NV2A video acceleration. There's some interesting info about all of the NVidia architectures.

https://media.readthedocs.org/pdf/envytools/l … t/envytools.pdf

NV34 / 5200 is a curious chip. On the surface it sounds very similar to NV31 / 5600. But it actually seems to be somewhat based on GF4 MX.

watson wrote:

I also tried out HL2 with DX9 on the FX 5600 (720p, lowest) and it's a slideshow. 5900XT would probably be somewhat playable at these settings, but any sane person would just stick with DX8.

https://www.youtube.com/watch?v=KwUxxLzu-uk 🤣

Anandtech mentions this:
"In fact, the architecture of the NV34 is, according to NVIDIA, virtually identical to the NV31; the only official differences being the memory controller and clock speeds.

The NV34 128-bit memory controller is devoid of any of the NV30/31's compression algorithms, mostly to keep die size down on the larger 0.15-micron process"

https://www.anandtech.com/show/1080/3

Reply 13 of 13, by watson

Rank: Member

Anandtech wrote:

NVIDIA wasn't kidding when they said that they would bring DirectX 9 support to their entire line of GPUs, and they have delivered.

ATI has no competition to the 5200 series, simply because their Radeon 9200/9100/9000 do not offer DirectX 9 support, they are based on a DirectX 8 class of GPUs. Granted that there won't be any use of DirectX 9 specific features in games for quite a while, but the tables have definitely turned; ATI is now the one that isn't delivering all of the features possible to the entry-level, and NVIDIA is clearly learning from their own past mistakes.

All this sounds very funny in retrospect. And it's written by Anand himself.
A paperweight could be considered competition to the FX 5200.