VOGONS


How is 16-bit dithering controlled?


First post, by maximus

Rank: Member

This has always been a bit of a mystery to me. Who controls 16-bit dithering: games, drivers, or hardware?

I know that some video cards support 16-bit dithering and some don't, and that quality varies between cards. However, some games just refuse to use dithering regardless of hardware support. Do drivers also come into play? Is there a way to force 16-bit dithering on or off at the driver level?

PCGames9505

Reply 1 of 22, by The Serpent Rider

Rank: l33t++

maximus wrote:

some games just refuse to use dithering regardless

Examples?

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 2 of 22, by maximus

Rank: Member
The Serpent Rider wrote:

Examples?

Croc: Legend of the Gobbos comes to mind. I've never seen it use dithering in Direct3D mode, even on hardware that definitely supports it.

I feel like I've seen color banding in other games as well (Moto Racer, Need for Speed: Porsche Unleashed, possibly Rogue Squadron), but I can't be sure if those were hardware-specific problems or not. Like I said, it can be a little mysterious.

My working theory is that some cards support 16-bit dithering and some don't, but that games also have the ability to force dithering on or off. What I'm not sure about is whether drivers have the ability to control dithering dynamically, or if some drivers disable dithering completely even if the hardware supports it.

PCGames9505

Reply 3 of 22, by PhilsComputerLab

Rank: l33t++

I've seen some games have this as an option to toggle on and off, but I never played around with it.

YouTube, Facebook, Website

Reply 4 of 22, by swaaye

Rank: l33t++

I think all cards from those days have some sort of dithering in use. It's either error diffusion (looks noisy/grainy) or ordered dithering (tends to be blocky/banded). Some graphics cards have options. ATI used to support both ordered dithering and error diffusion dithering (image). 3dfx cards all have options to control their RAMDAC 16-bit post processing. Voodoo4/5 are the most sophisticated (info). SSAA also effectively dithers the image.
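
For illustration, here is a minimal sketch of what ordered (Bayer) dithering does when a card quantizes an 8-bit channel down to the 5 bits of RGB565. It only shows the technique described above, not how any particular card or driver actually implements it; all names are illustrative.

```c
#include <stdint.h>

/* Classic 4x4 Bayer threshold matrix, values 0..15. */
static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Quantize one 8-bit channel value to 5 bits, nudged by a position-dependent
   threshold so neighbouring pixels round in different directions. The fixed
   spatial pattern is why ordered dithering looks blocky/patterned, whereas
   error diffusion (which pushes each pixel's rounding error onto its
   neighbours instead) looks noisy/grainy. */
static uint8_t dither_to_5bit(uint8_t value, int x, int y)
{
    /* Scale the 0..15 matrix entry to roughly +/- half a quantization step
       (one 5-bit step covers 8 input values). */
    int biased = value + (bayer4[y & 3][x & 3] >> 1) - 4;
    if (biased < 0)   biased = 0;
    if (biased > 255) biased = 255;
    return (uint8_t)(biased >> 3);   /* 0..31 */
}
```

Run over a smooth gradient, this turns each hard band edge into a checkered mix of the two nearest 5-bit levels, which is exactly the patterned look described above.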

There was a time in the early D3D10 card era when nobody seemed to support 16-bit dithering and old 16-bit color games were extremely banded. This is what 8800 produced in 2007 (image). I think ATI and NV both improved their legacy support over time.

Reply 5 of 22, by Azarien

Rank: Oldbie
swaaye wrote:

There was a time in the early D3D10 card era when nobody seemed to support 16-bit dithering and old 16-bit color games were extremely banded. This is what 8800 produced in 2007 (image). I think ATI and NV both improved their legacy support over time.

Was that an improvement on the driver side, or are there cards with 16-bit support so ugly that it can't be fixed?

Reply 6 of 22, by The Serpent Rider

Rank: l33t++

Both Nvidia and AMD dropped 16-bit dithering completely. Most likely they can't even do a real 16-bit mode and just internally scale it to 32-bit.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 7 of 22, by mrau

Rank: Oldbie

I kinda can't get what I'm not getting here. The Riva 128 was called the queen of dithering in some magazine back in the day, while other vendors had slower or just plain ugly dithering; I definitely saw this happening in 3DMark99 back then. Newer cards may be slower with this, as we're now forced to use 32-bit color.

Reply 8 of 22, by swaaye

Rank: l33t++

With Riva 128 you can really see the dithering. Error diffusion dithering I guess? It's an interesting look but Riva 128 has many problems/limitations. TNT is a big improvement. Here are some 200% upscaled screenshots I took years ago.

qQxca9sO.jpg
wzvm7diL.jpg
More captures here
https://www.mediafire.com/folder/e5r327h8a42n0/PNG_caps

I think the best 16-bit image is probably produced by 3dfx Voodoo3/4/5. Matrox G200/G400 are nice too.

Reply 9 of 22, by swaaye

Rank: l33t++
The Serpent Rider wrote:

Both Nvidia and AMD dropped 16-bit dithering completely. Most likely they can't even do a real 16-bit mode and just internally scale it to 32-bit.

I'm not sure what they do but old 16-bit games look fine these days on my 1070. I don't have an 8800 to mess with anymore unfortunately.

I remember Jedi Knight didn't render correctly when I had my 8800GTX but they eventually fixed it. The image was a corrupted mess. I think at the time ATI and NV were focused on getting the new architectures optimized for contemporary games and eventually went back to legacy games. Perhaps because there was so much complaining on the net.

Reply 10 of 22, by The Serpent Rider

Rank: l33t++
swaaye wrote:

on my 1070

It supports dithering?
Interesting. Probably that's why: http://www.anandtech.com/show/10325/the-nvidi … dition-review/5

Speaking of architectural details, I know that the question of FP16 (half precision) compute performance has been of significant interest. FP16 performance has been a focus area for NVIDIA for both their server-side and client-side deep learning efforts, leading to the company turning FP16 performance into a feature in and of itself.

Starting with the Tegra X1 – and then carried forward for Pascal – NVIDIA added native FP16 compute support to their architectures. Prior to these parts, any use of FP16 data would require that it be promoted to FP32 for both computational and storage purposes, which meant that using FP16 did not offer any meaningful improvement in performance or storage needs. In practice this meant that if a developer only needed the precision offered by FP16 compute (and deep learning is quickly becoming the textbook example here), that at an architectural level power was being wasted computing that extra precision.

So anything from 8800 to 980 Ti is out of luck. Same thing goes for any Radeon before Polaris series.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 11 of 22, by spiroyster

Rank: Oldbie

In OpenGL land, dithering is implemented by the vendor (ICD). The API gives you one switch, GL_DITHER, which is in fact enabled by default (even in GL 4.5!). The drivers may or may not expose extra dithering options, and it wouldn't surprise me if there were vendor-specific extensions for choosing the dithering type (although I can't remember any specifically), but generally this isn't something a developer needs to concern themselves with unless they are working on that area of a graphics engine (in which case they may not be using DirectX or GL, or will be using it in a limited capacity anyhow).
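
For reference, that single switch is about all the control core GL gives an application. A minimal sketch, assuming a current GL context has already been created elsewhere (e.g. via SDL or GLFW); everything beyond this is up to the ICD:

```c
#include <GL/gl.h>
#include <stdio.h>

/* Query and toggle the fixed-function dither stage. Requires a current
   OpenGL context (creation not shown); uses only core GL 1.x calls. */
void show_dither_state(void)
{
    /* GL_DITHER is enabled by default, per the GL spec. */
    printf("GL_DITHER is %s\n", glIsEnabled(GL_DITHER) ? "on" : "off");

    glDisable(GL_DITHER);   /* ask the ICD not to dither */
    glEnable(GL_DITHER);    /* hand the decision back to the ICD */
}
```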

There was a time when you could get dithering by forcing the context colour buffer to be less than 24-bit, but no longer. It would be quite trivial to implement this effect in a fragment shader, however.


Reply 12 of 22, by swaaye

Rank: l33t++
The Serpent Rider wrote:

Starting with the Tegra X1 – and then carried forward for Pascal – NVIDIA added native FP16 compute support to their architectures. […]

So anything from 8800 to 980 Ti is out of luck. Same thing goes for any Radeon before Polaris series.

FP16 is a floating-point precision format. That's not related. Floating-point arithmetic was something that came about with D3D9-era cards.

Reply 13 of 22, by The Serpent Rider

Rank: l33t++
swaaye wrote:

FP16 is a floating-point precision format.

I know, but only GPUs with that feature suddenly got 16-bit dithering back. I can't get dithering on Kepler- or Maxwell-based GPUs.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 14 of 22, by spiroyster

Rank: Oldbie

Dithering is a process; it's not something you get for free when supporting 16-bit types. For free, you get 'colour banding' due to the loss of colour precision. A 16-bit float represents one value with floating-point precision. 16-bit colour refers to 16 bits representing an RGB triplet (three values: 5, 6 and 5 bits respectively).
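
A tiny sketch of that 5,6,5 packing and of the banding you get for free without dithering; the helper names here are just illustrative, not from any real driver:

```c
#include <stdint.h>
#include <stdio.h>

/* Pack 8-bit-per-channel RGB into RGB565, discarding the low bits. */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Expand RGB565 back to 8 bits per channel for display. */
static void unpack_rgb565(uint16_t c, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);
    *g = (uint8_t)(((c >>  5) & 0x3F) * 255 / 63);
    *b = (uint8_t)(( c        & 0x1F) * 255 / 31);
}

int main(void)
{
    /* Neighbouring 8-bit reds collapse onto far fewer 5-bit steps:
       100..103 all come back as 98, and 104..107 as 106. Those flat
       runs are exactly the colour banding seen without dithering. */
    for (int r = 100; r <= 107; ++r) {
        uint8_t rr, gg, bb;
        unpack_rgb565(pack_rgb565((uint8_t)r, 0, 0), &rr, &gg, &bb);
        printf("in %3d -> out %3d\n", r, rr);
    }
    return 0;
}
```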

The vendor implements dithering; it could be done in software in the driver, or in hardware. Traditionally it had to be done because of the limited precision of the hardware's colour buffer. I can't imagine the rationale behind re-implementing this feature in these kinds of modern architectures.

If it was there, then wasn't, then was again, my money would be on the driver implementing this mostly in software, maybe somewhat hardware-accelerated (compute), but not entirely in hardware. But I don't know.

The Serpent Rider wrote:

Starting with the Tegra X1 – and then carried forward for Pascal – NVIDIA added native FP16 compute support to their architectures. […]

Translation:

If you are working with values which have no need for the higher precision, you can get performance gains from the hardware because with a native FP16 type, it can do twice the work in the same time as 32-bit float. Plus you can hold twice the amount of data in an array, which means copies/operations can be done on twice as much information in the same amount of time. There are many areas which would not require anything more than 16-bit precision, but RGB colour representation is clearly something which does 😀

Reply 15 of 22, by The Serpent Rider

Rank: l33t++

spiroyster wrote:

If it was there, then wasn't, then was again

Once again: you can't get dithering on Fermi, Kepler and Maxwell GPUs with fresh drivers. Period. So it seems like anything after G80 and before the Pascal series just can't render 16-bit modes.
Also, you still get dithering on any card before G80 with a driver that supports it.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 16 of 22, by swaaye

Rank: l33t++

It just occurred to me that I have an 8600GT at home. I will experiment with it and also look closer at what the 1070 does. I only tried AVP2 at 16-bit color and that looked fine.

Another interesting thing to think about is Android devices. Some Android games use 16-bit color depth because it helps slow GPUs. I have seen interesting differences between hardware. For example, Intel's Atom Baytrail GPU seems to dither 16-bit color depth. I also have Tegra 2, 4 and K1 devices and the K1 is quite banded with 16-bit color depth. Don't remember what Tegra 2 or 4 look like...

Reply 17 of 22, by swaaye

Rank: l33t++

Ok, never mind Pascal/1070 having dithering. 😀 It looks like in Windows 10, when you use 16-bit color, the OS somehow converts it to a 32-bit mode. Sometimes programs don't enumerate 16-bit modes at all. And when you can choose 16-bit, it performs quite poorly most of the time, depending on what is being rendered.

But I have a Windows 7 dual boot, so here is ugly non-dithered 16-bit color from a GeForce 1070.
y2Tf5v4H.jpgGr5vd1jD.jpg 1RzhChrs.jpg
4xn15M46.jpg 6Idur40O.jpg

Reply 18 of 22, by leileilol

Rank: l33t++

It should still be a 32-bit buffer. If you drop some sort of dither shader into something like ReShade for those 16bpp games, you should be able to get dithering again (but in the form of post-dithering, which is different from typical texture dithering).

long live PCem

Reply 19 of 22, by swaaye

Rank: l33t++

Yeah, I've seen the dither shader in ReShade before. I'm not sure how effective that is...

Anyway here is a Radeon HD 6950 doing the same programs in 16-bit color. No dithering again. This was in XP on my nForce4 box.
zz0vf6s5.jpg wPmpN1C6.jpg
AdNBffeL.jpg