First post, by vorob

User metadata
Rank Member

I'm here with a very specific question. 16-bit games used dithering to hide color banding; my primary example is Thief II.

To see what I mean, compare these two Thief II screenshots, both from a working install.

One with dithering (on Intel 4500):

One without dithering (on 2080 via dgVoodoo):

So dgVoodoo lets me run Thief II natively (single-core affinity is set via a cmd script). One thing is missing: dithering. How can I enable it?

Reply 1 of 2, by leileilol

User metadata
Rank l33t++

You can't enable it on most hardware, as dithering got ripped out of DX10-class hardware long ago. Intel is one of the very few that still does it (and their dithering is very, very good).

It's probably possible to get some kind of dither post-process shader working. It definitely wouldn't look like the old per-texture dither with its cute overdraw feedback artifacts, though; it would be more like PowerVR-style dithering in this case.
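To illustrate what a dither post-process pass does, here is a minimal sketch in Python rather than an actual shader. It applies a classic 4x4 Bayer ordered dither while quantizing an 8-bit channel down to 5 bits (the precision of a 16-bit framebuffer's red/blue channels). The function name, the per-channel interface, and the 5-bit choice are illustrative assumptions, not dgVoodoo's actual pipeline.

```python
# Hypothetical sketch of an ordered-dither post-process step.
# A real implementation would run per-pixel in a fragment shader.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value, x, y, bits):
    """Quantize an 8-bit channel value to `bits` of precision,
    adding a position-dependent Bayer threshold first so the
    quantization error becomes a fine screen-door pattern
    instead of visible banding."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0  # in [0, 1)
    levels = (1 << bits) - 1
    v = value / 255.0
    q = int(v * levels + threshold)   # threshold stands in for round()
    q = max(0, min(levels, q))        # defensive clamp
    return q * 255 // levels          # expand back to 8-bit for display
```

Because the threshold varies with screen position, two neighbouring pixels with the same input value can quantize to different levels, which is exactly what breaks up the bands.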

By the way, DOSBox is not for running Windows 9x.

Reply 2 of 2, by vladstamate

User metadata
Rank Oldbie

While leileilol is right that you cannot get 100% the same feel as the hardware dithering of old graphics cards, you can get pretty close nowadays with shaders, and maybe even do it "better" (meaning more accurate at removing banding), at the expense of performance. A simple search for "dithering" on ShaderToy turns up good examples: https://www.shadertoy.com/results?query=tag%3Ddithering. You could also do run-time, noise-based variable-pattern dithering for improved results if you have the cycles, or use an existing algorithm like Floyd–Steinberg.
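For reference, a minimal sketch of the Floyd–Steinberg error diffusion mentioned above, here on a grayscale image (a list of rows of 0-255 ints) quantized to `bits` of precision. This is just the textbook algorithm, not anything from dgVoodoo or Thief II; note that error diffusion is inherently serial, which is one reason real-time shader versions tend to prefer ordered dithering instead.

```python
# Textbook Floyd-Steinberg error diffusion (grayscale sketch).

def floyd_steinberg(pixels, bits):
    h, w = len(pixels), len(pixels[0])
    img = [[float(p) for p in row] for row in pixels]
    levels = (1 << bits) - 1
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            # snap to the nearest representable level
            new = round(old / 255 * levels) * 255 / levels
            out[y][x] = int(new)
            err = old - new
            # diffuse the quantization error to unprocessed neighbours
            if x + 1 < w:
                img[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1][x - 1] += err * 3 / 16
                img[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1][x + 1] += err * 1 / 16
    return out
```

Because the error is carried forward, a flat mid-gray quantized to 1 bit comes out as a mix of black and white pixels whose average stays close to the original gray.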

YouTube channel: https://www.youtube.com/channel/UC7HbC_nq8t1S9l7qGYL0mTA
Collection: http://www.digiloguemuseum.com/index.html
Emulator: https://sites.google.com/site/capex86/
Raytracer: https://sites.google.com/site/opaqueraytracer/