VOGONS


Source DirectX9 grey fog bug


Reply 40 of 46, by a0divided

Rank: Newbie

This bug basically only happens when using DX9 with games that run on pre-Orange Box versions of Source, like SiN as mentioned above, or HL2 & EP1 before the 2010 engine update.

But what's interesting is that this bug was fixed at some point by Nvidia, at least on their newer cards with the latest drivers. I currently have an RTX 3060 Laptop GPU with driver 516.5 and the fog actually works properly on DX9. Unfortunately, I think it's still bugged with AMD or Intel GPUs...

I'm not an expert by any means, but I think the problem might be related to differences in how older (DX9-era) cards handle blending compared to modern DX10+ cards. There was actually a GDC presentation about this for the Source engine: https://cdn.akamai.steamstatic.com/apps/valve … heOrangeBox.pdf

Reply 41 of 46, by Dege

Rank: l33t

Thanks for that document! It's very interesting; I didn't know about the different sRGB read/write behavior of "old" and new (standardized, DX10+) hardware.

So it seems that early hardware may have done the sRGB linearization (reading) and gamma correction (writing) at the wrong stages of the pipeline:
- for reading: after filtering, at texture sampling
- for writing: before the actual blending operation

Both of these are incorrect in mathematical terms, and the new DX10 way is the correct one:

- for reading: right before the filtering when sampling a texture
- for writing: right before writing back the RGB value into the frame buffer
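
To make the read-side difference concrete, here's a quick C++ sketch (the srgb_to_linear helper implements the IEC 61966-2-1 decode; everything else is just illustration). Filtering the encoded texels and then linearizing gives a different value than linearizing each texel first and then filtering:

```cpp
#include <cmath>
#include <cstdio>

// IEC 61966-2-1 sRGB decode (roughly gamma 2.2), applied per channel.
static float srgb_to_linear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

int main()
{
    // Two neighboring sRGB-encoded texels, sampled exactly halfway between them.
    const float a = 0.0f, b = 1.0f, t = 0.5f;

    // Old (pre-DX10) hardware: filter the encoded values, then linearize.
    float oldWay = srgb_to_linear(a * (1 - t) + b * t);   // srgb_to_linear(0.5)

    // DX10-class hardware: linearize each texel first, then filter.
    float newWay = srgb_to_linear(a) * (1 - t) + srgb_to_linear(b) * t;

    std::printf("old: %.4f  new: %.4f\n", oldWay, newWay);
    // Prints roughly: old: 0.2140  new: 0.5000 -- a very visible difference.
    return 0;
}
```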

I found an NVIDIA document stating this:

On NVIDIA GeForce 8-class (and future) hardware, all samples used in a texture filtering operation are linearized before filtering to ensure the proper filtering is performed (older GPUs apply the gamma post-filtering). The correction applied is an IEC standard (IEC 61966-2-1) that corresponds to a gamma of roughly 2.2 and is a safe choice for nonlinear color data where the exact gamma curve is not known. Alpha values, if present, are not corrected.

Any value returned in the shader is gamma-corrected before storage in the frame buffer (or render-to-texture buffer). Furthermore, on GeForce 8-class and later hardware, if blending is enabled, the previously stored value is converted back to linear before blending and the result of the blend is gamma-corrected.
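
Modeling the two write paths in C++ (the helper and function names are mine, purely illustrative), the only difference is where the encode/decode sits relative to the blend, but the results diverge:

```cpp
#include <cmath>
#include <cstdio>

// IEC 61966-2-1 conversions, per channel.
static float srgb_to_linear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

static float linear_to_srgb(float l)
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}

// DX10-class behavior: decode the stored destination, blend in linear
// space, then re-encode the result for storage.
static float blend_dx10(float srcLinear, float dstStored, float alpha)
{
    float dstLinear = srgb_to_linear(dstStored);
    float blended   = srcLinear * alpha + dstLinear * (1.0f - alpha);
    return linear_to_srgb(blended);
}

// Old behavior: encode the shader output first, then blend the already
// encoded values (the blend itself runs in gamma space).
static float blend_old(float srcLinear, float dstStored, float alpha)
{
    float srcStored = linear_to_srgb(srcLinear);
    return srcStored * alpha + dstStored * (1.0f - alpha);
}

int main()
{
    // Shader outputs 50% linear grey over a stored value of 0.25, alpha 0.5.
    std::printf("dx10: %.4f  old: %.4f\n",
                blend_dx10(0.5f, 0.25f, 0.5f),
                blend_old(0.5f, 0.25f, 0.5f));
    return 0;
}
```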

In D3D9Ex there is a cap bit (D3DPMISCCAPS_POSTBLENDSRGBCONVERT) indicating the correct DX10-style behavior. It isn't exposed by dgVoodoo yet, but it's all the same; it wouldn't help.
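
For reference, querying that cap bit on a D3D9Ex device would look roughly like this (untested sketch):

```cpp
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    // Create the D3D9Ex interface (the cap bit below only exists in 9Ex).
    IDirect3D9Ex* d3d = nullptr;
    if (FAILED(Direct3DCreate9Ex(D3D_SDK_VERSION, &d3d)))
        return 1;

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // Set => the destination is linearized before blending and the
        // result re-encoded afterwards (the DX10-style behavior).
        bool dx10Style =
            (caps.PrimitiveMiscCaps & D3DPMISCCAPS_POSTBLENDSRGBCONVERT) != 0;
        std::printf("DX10-style sRGB blending: %s\n", dx10Style ? "yes" : "no");
    }
    d3d->Release();
    return 0;
}
```
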
I looked again at SiN Episodes and ran it with mipmapping disabled and point filtering forced. This makes the old and new sampling processes equivalent.
Also, when the "wrong" textures are drawn, blending is disabled, meaning the old and new sRGB writing methods are also equivalent.
Nothing changed in the game visuals, so I think this difference is not our problem here. Unfortunately, for blended cases it could be a problem. Alex Vlachos proposes two solutions to handle the difference; one of them is:

Detect DX10 behavior and simulate the sRGB write in shader code forcing gamma blending

TBH, I don't know what he meant here. The blending operation cannot be emulated from a shader, so dgVoodoo cannot emulate the old sRGB behavior for blending. I don't know if it's a problem in practice, but anyway, HL2 is mentioned in the docs.
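
If I had to guess, one possible reading of that suggestion (this is only my interpretation, not something from the slides) is that the game itself renders to a non-sRGB target and does the encode manually at the end of its own pixel shader, so the fixed-function blender then mixes already-encoded values, like old hardware did. A game's own shaders can do that; a wrapper sitting underneath them can't easily retrofit it. Roughly:

```cpp
// Purely my interpretation of "forcing gamma blending": render to a
// NON-sRGB target (D3DRS_SRGBWRITEENABLE off) and encode manually at the
// end of the pixel shader, so the fixed-function blender mixes values
// that are already gamma-encoded, like old hardware effectively did.
#include <cmath>

static float linear_to_srgb(float l)
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}

// What the end of the game's pixel shader would compute per channel:
float shader_output(float linearColor)
{
    return linear_to_srgb(linearColor); // manual encode instead of sRGB write
}

// The blender then operates on already-encoded values (gamma-space blend):
float fixed_function_blend(float srcEncoded, float dstEncoded, float alpha)
{
    return srcEncoded * alpha + dstEncoded * (1.0f - alpha);
}
```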

For this whole problem, I have only one idea remaining. There is a difference between the D3D8 and D3D9 DDIs in how the fog coordinate is handled. The early D3D9 NV drivers might have done it the old D3D8 way and might also have had a bug in handling the fog state. And later, when those bugs were fixed, the game visuals went wrong (like the thing with shadow buffers on the GF4 and Splinter Cell 1), and this old driver behavior is nowadays "emulated" on NV hardware based on a game profile or something like that. But what about other vendors? How did these games work on AMD cards?
Anyway, unfortunately I cannot check ATM whether I'm right; it'd need some code development. And even if my theory proves to be true, I don't know how to solve it through dgVoodoo. I'm not happy about introducing a new option for that.

Reply 42 of 46, by Bubblesix

Rank: Newbie
Dege wrote on 2022-07-26, 17:17:

TBH I don't know what he thought of here. The blending operation cannot be emulated from shader so dgVoodoo cannot emulate the old sRGB behavior for blending. […]

I can't really comprehend the issue, but I want to confirm whether you noticed that the GeForce profile in dgVoodoo already does something to lessen the effect of the bug, as I've shown in the SiN screenshots above. Maybe that will provide a lead?

Reply 44 of 46, by Bubblesix

Rank: Newbie

Any updates on this? I'm only asking because dgVoodoo is the only way I can play old Source games bug-free, except for the fog. And like I've mentioned, setting an Nvidia profile already does something to lessen it.

Reply 45 of 46, by Dege

Rank: l33t

Sadly, no. My current theory about the problem is the "point of linearization" for sRGB render targets, namely: when does the pixel shader output get linearized, before or after alpha blending?
I guess NV can choose either internally based on the game profile, but it cannot be controlled through DX11/12. (It was not defined in DX9; a cap bit told you which of the two cases you got.)
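
To show why this single choice is so visible, here's a small C++ computation with made-up but plausible values: blending a dark surface toward a mid-grey fog color gives clearly different results depending on whether the blend runs on encoded or linear values:

```cpp
#include <cmath>
#include <cstdio>

// IEC 61966-2-1 conversions, per channel.
static float srgb_to_linear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

static float linear_to_srgb(float l)
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}

int main()
{
    // Dark surface, mid-grey fog, half fogged (all values sRGB-encoded).
    const float surf = 0.1f, fog = 0.5f, f = 0.5f;

    // Blend performed on the encoded values (gamma-space blend):
    float gammaBlend = surf * (1 - f) + fog * f;   // = 0.300

    // Blend performed on linear values, re-encoded for display:
    float linearBlend = linear_to_srgb(
        srgb_to_linear(surf) * (1 - f) + srgb_to_linear(fog) * f); // ~0.369

    std::printf("gamma-space: %.3f  linear-space: %.3f\n",
                gammaBlend, linearBlend);
    // The two differ noticeably; if a game was tuned for one behavior and
    // the driver uses the other, fog comes out the wrong shade of grey.
    return 0;
}
```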

Reply 46 of 46, by Bubblesix

Rank: Newbie
Dege wrote on 2023-02-21, 17:44:

Sadly, no. My current theory about the problem is the "point of linearization" for sRGB render targets […]

But what would be the mechanics of the Nvidia vendor ID greatly reducing the severity of the pixel bleaching? That would mean the game is doing something different based on the vendor ID, correct?