First post, by lizard78
Recently I was working on a DirectX 5 backend for my retro project and I ran into the same inconsistent table fog a lot of late-90s titles suffered from. There are extensive threads on here already discussing which drivers work and which don't, but I didn't see anyone clearly explain why this happens. Here are a couple of examples from my own testing on my project:
3dfx cards - worked fine, no surprises there.
ATI cards - worked fine, with later drivers.
S3 Savage IX8 - No fog. No surprise there, right? It's a mobile chipset, so it probably just doesn't support it.
Well, then I checked the device caps, and it actually claimed to support table-based fog (D3DPRASTERCAPS_FOGTABLE). I also checked under WineD3D - same problem: no fog, yet fog table support reported. To be clear before moving on: I'm not talking about range-based or vertex fog support here at all, only table fog.
At that point I figured I had to be doing something wrong, so I did some digging and found what I believe is the answer. Behind the scenes, DirectX supported both Z-based and W-based table fog, but the Z-based modes were originally only used in software rendering. Z-based fog expects FOGSTART / FOGEND in a normalized 0 - 1 range, while W-based table fog is specified in world units. Since Z-based table fog was absent from hardware at the time, if a device reported fog table support you could just assume it was W-based and "get away with it". If you provide world units but the driver assumes Z-based table fog, however, you get the all-too-familiar no-fog look.
Based on a bit of testing, I eventually figured out that the driver was doing Z-based fog in that normalized range. Then I stumbled upon an old Nvidia document. It turns out that for correct W fogging in Direct3D (side note: this isn't required at all in Glide, since it only uses W fog) you need to provide a proper perspective matrix, so the driver can optionally fall back to Z-based table fogging. Essentially, it just checks whether the last column of the matrix is affine; if it is, it assumes Z-based fogging (unless the driver does W-based table fog).
So the bottom line, I guess, is that most of the problematic games likely aren't setting the perspective matrix the correct way, as it is very poorly documented (in fact, the DirectX 5 SDK makes zero mention of it). They "just work" on drivers that natively use W-based table fog or force it, but break on ones that rely on the perspective matrix to perform Z-based table fog. When I added the perspective matrix the Nvidia document calls for, fog just started working everywhere, including on the S3 card. I'm sure others who have written wrappers etc. have bumped into this, but I found it interesting to discover some of it on my own. Has anyone else looked into this at all? I'm sure what I wrote here isn't the full story.