First post, by Great Hierophant
True VGA can display 18-bit color: 6 bits each for R, G, and B. Modern graphics adapters typically display 24-bit color: 8 bits each for R, G, and B. 24-bit color can therefore show more colors than 18-bit color.
I would assume that converting a 6-bit value into an 8-bit value works like this:
6-bit color value 0 = 8-bit color value 0
6-bit color value 1 = 8-bit color value 4
6-bit color value 2 = 8-bit color value 8
6-bit color value 3 = 8-bit color value 12
and so on. Thus, when displaying VGA graphics, no palette entry should contain any value other than a multiple of 4. But this assumes a linear conversion of the color values.
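Here is a small sketch (in C) of the conversion I am assuming, with a full-range scaling for comparison. Which method any given DAC or emulator actually uses is a guess on my part:

```c
#include <stdio.h>
#include <stdint.h>

/* Expand a 6-bit VGA DAC value (0-63) to 8 bits (0-255) using the
   simple linear scaling assumed above: multiply by 4. */
static uint8_t vga6_to_8_shift(uint8_t v6)
{
    return (uint8_t)(v6 << 2);          /* 0, 4, 8, ... 252 */
}

/* Alternative full-range scaling: round(v * 255 / 63). This maps 63
   to 255 instead of 252, so results are no longer exact multiples
   of 4. (Assumption: not every adapter does it this way.) */
static uint8_t vga6_to_8_scale(uint8_t v6)
{
    return (uint8_t)((v6 * 255 + 31) / 63);
}

int main(void)
{
    for (uint8_t v = 0; v < 64; v += 21) {
        printf("6-bit %2u -> shift %3u, full-range %3u\n",
               v, vga6_to_8_shift(v), vga6_to_8_scale(v));
    }
    return 0;
}
```

If the ×4 scaling is what the hardware does, every converted palette entry lands on a multiple of 4; the full-range variant does not, which is why I flag the "linear conversion" assumption above.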
Now, when using monochrome graphics on VGA, where R = G = B, only 64 shades should be available, since each component is 6 bits. IBM indicates that the R, G, and B values are summed and output on the green VGA pin for a monochrome display. Yet some games advertise 256 colors with their grayscale drivers on monochrome monitors (Sierra SCI 256-color games, Pinball Fantasies). This was long before SVGA and 8-bit palette DACs. How can they do this?
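For reference, here is a sketch of the gray summing as I understand it. The VGA BIOS gray-summing function is usually described as weighting R, G, and B at roughly 30/59/11 percent; treat those exact coefficients as my assumption. Either way the result stays within 6 bits, which is why I only count 64 possible gray levels:

```c
#include <stdio.h>
#include <stdint.h>

/* Sum a 6-bit R,G,B DAC entry to a single 6-bit gray value, using the
   30/59/11 weighting commonly attributed to the VGA BIOS gray-summing
   function (the exact coefficients are my assumption). */
static uint8_t vga_gray_sum(uint8_t r6, uint8_t g6, uint8_t b6)
{
    /* Weights sum to 100%, so the result stays in 0-63 when the
       inputs are 0-63. */
    unsigned gray = (30 * r6 + 59 * g6 + 11 * b6 + 50) / 100;
    return (uint8_t)(gray > 63 ? 63 : gray);
}

int main(void)
{
    /* Count how many distinct gray levels a full 6-bit palette can hit. */
    int seen[64] = {0};
    for (int r = 0; r < 64; r++)
        for (int g = 0; g < 64; g++)
            for (int b = 0; b < 64; b++)
                seen[vga_gray_sum((uint8_t)r, (uint8_t)g, (uint8_t)b)] = 1;

    int count = 0;
    for (int i = 0; i < 64; i++)
        count += seen[i];
    printf("distinct 6-bit gray levels: %d\n", count);  /* at most 64 */
    return 0;
}
```

No matter what R, G, and B values the palette holds, the summed output can only take 64 distinct values, so I don't see where 256 gray shades could come from.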
http://nerdlypleasures.blogspot.com/ - Nerdly Pleasures - My Retro Gaming, Computing & Tech Blog