VOGONS


Reply 60 of 66, by PhilsComputerLab

User metadata
Rank l33t++
Putas wrote:
philscomputerlab wrote:

Read a Matrox G400 review recently and it mentioned that their 16 bit mode calculates everything internally in 32 bits

That was the norm; 16-bit was never a proper graphics standard. The interesting question is: which 3D accelerators, if any, were internally capped to 16 bits only?

I see. When I read the article, it seemed like something special 😀

I got the impression that the G400 renders 16 bit games the nicest. I will check it out once I benchmark the Matrox cards.

Last edited by PhilsComputerLab on 2015-11-08, 08:45. Edited 1 time in total.

YouTube, Facebook, Website

Reply 61 of 66, by Scali

User metadata
Rank l33t
Putas wrote:

That was the norm; 16-bit was never a proper graphics standard. The interesting question is: which 3D accelerators, if any, were internally capped to 16 bits only?

The confusion is probably because there's a difference between the ALU used for rendering operations and the data format used for textures and framebuffers.
Just like with CPUs, once you have an ALU capable of 32-bit, it is no slower to perform 32-bit calculations than 16-bit ones.
Therefore, video chips would always process textures, lighting etc. at 32-bit internally anyway. The thing is just that loading a 16-bit texture was twice as fast as loading a 32-bit texture, because you only had to send half the data over the same memory interface.
Likewise, storing to a 16-bit framebuffer was twice as fast as storing to a 32-bit one (the dithering was just a fixed-function unit, pipelined into the design, so it didn't result in longer rendering times, just like fetching data from a texture implicitly converted it to 32-bit 'for free').

Now, all conventional 3d accelerators will just load textures, perform the lighting/blending operations on these textures, then perform the blending with the framebuffer (if any), and store the result on a per-pixel basis.
So you get 16-bit -> 32-bit -> 16-bit.
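
To make that round trip concrete, here is a minimal C sketch of what the fixed-function stages do per pixel. The function names are made up for illustration, and plain truncation stands in for a real dither; it's only the general idea, not any specific chip's pipeline.

#include <stdint.h>

/* Expand RGB565 to 8:8:8 by replicating the top bits into the low bits,
   roughly what the texture-fetch stage gives you 'for free'. */
static uint32_t rgb565_to_rgb888(uint16_t c)
{
    uint32_t r = (c >> 11) & 0x1F, g = (c >> 5) & 0x3F, b = c & 0x1F;
    return ((r << 3 | r >> 2) << 16) | ((g << 2 | g >> 4) << 8) | (b << 3 | b >> 2);
}

/* Reduce 8:8:8 back to RGB565 on the store to a 16-bit framebuffer
   (real hardware would typically dither here rather than just truncate). */
static uint16_t rgb888_to_rgb565(uint32_t c)
{
    return (uint16_t)((((c >> 19) & 0x1F) << 11) | (((c >> 10) & 0x3F) << 5) | ((c >> 3) & 0x1F));
}

/* One pixel of a hypothetical 50/50 blend: fetch (16 -> 32), blend at full
   precision, store (32 -> 16). Precision is only lost at the two ends. */
static uint16_t blend_pixel_565(uint16_t texel, uint16_t dest)
{
    uint32_t s = rgb565_to_rgb888(texel);
    uint32_t d = rgb565_to_rgb888(dest);
    uint32_t avg = ((s >> 1) & 0x7F7F7F) + ((d >> 1) & 0x7F7F7F); /* per-channel average */
    return rgb888_to_rgb565(avg);
}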

PowerVR is the exception to the rule here. Because it is a tile-based renderer, it doesn't actually render to video memory. It renders to the tile cache, which is always a 32-bit buffer. So it only goes 16-bit -> 32-bit (if you use 16-bit textures, that is). The pixels remain 32-bit in the tile cache, and blend operations are also done in 32-bit.
It only does 32-bit -> 16-bit once the tile has finished rendering and is stored to the final framebuffer in video memory.

This only works because the tile cache is much faster than video memory. For conventional renderers this wouldn't make sense: they would have to put the 'temporary' framebuffer in video memory as well, in which case they'd run into the same bottlenecks as with full 32-bit rendering, so they would only be slower for it.
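
A rough C sketch of that difference, purely to illustrate the idea; the tile size, the blend, and the names (tile_cache, blend_into_tile, flush_tile) are made up, so this is not PowerVR's actual hardware or API.

#include <stdint.h>

#define TILE_W 32
#define TILE_H 32

/* The on-chip tile cache: always 32-bit, regardless of the framebuffer format. */
static uint32_t tile_cache[TILE_W * TILE_H];

/* Any number of overlapping primitives blend into the tile at full 32-bit
   precision; nothing is converted down yet. */
static void blend_into_tile(int x, int y, uint32_t src888)
{
    uint32_t d = tile_cache[y * TILE_W + x];
    tile_cache[y * TILE_W + x] = ((src888 >> 1) & 0x7F7F7F) + ((d >> 1) & 0x7F7F7F);
}

/* Only when the tile is finished does each pixel get converted to 16-bit,
   once, and written out to the framebuffer in video memory. */
static void flush_tile(uint16_t *fb, int fb_pitch, int tile_x, int tile_y)
{
    for (int y = 0; y < TILE_H; y++)
        for (int x = 0; x < TILE_W; x++) {
            uint32_t c = tile_cache[y * TILE_W + x];
            fb[(tile_y * TILE_H + y) * fb_pitch + tile_x * TILE_W + x] =
                (uint16_t)((((c >> 19) & 0x1F) << 11) | (((c >> 10) & 0x3F) << 5) | ((c >> 3) & 0x1F));
        }
}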

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 62 of 66, by swaaye

User metadata
Rank l33t++

I've long wondered just what these do for Voodoo1 and 2.
SST_VIDEO_24BPP
SSTV2_VIDEO_24BPP

They seem to just be related to enabling gamma correction from what I can glean from Google Groups.

Reply 64 of 66, by rmay635703

User metadata
Rank Oldbie
leileilol wrote on 2015-10-28, 19:42:

16-bit has a green tinge due to the bit depth itself (6 bits green, 5 bits red and blue). Now that I've mentioned it, you'll notice it far more than you'd like now. It's the green that can't be unseen! It's like The Matrix! And despite a particular someone claiming otherwise, it's not lossless!!!

Speaking of alpha color:

When I was a kid, 15-bit color was a thing.
The Macs in the lab had it as their highest setting (the high/low 16/15-bit color distinction must have just been a Mac thing, though it was always mentioned in software mags).

16-bit high color, with the last bit selecting a 256-color palette or acting as a high/low black-and-white (alpha) flag, was also a thing in the documentation for various video card drivers.

So I guess my question is: why was green chosen to get the extra bit, rather than an alpha bit, in the 3D era?

Grayscale was many times the original alpha bit in earlier implementations, and it seems more logical for avoiding color mismatch issues. I would think that would be more important than the "my eyes can see more shades of green" argument, since my eyes can also see more shades of gray for the same reason.
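
For reference, a small C sketch of the 15/16-bit layouts being discussed, just to make the bit counts concrete. These are the standard packings, nothing vendor-specific; the pack_* helper names are only for illustration.

#include <stdint.h>

/* RGB555  : x RRRRR GGGGG BBBBB - 15-bit color, top bit unused
   (the Mac "thousands of colors" style mode). */
static uint16_t pack_rgb555(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
}

/* ARGB1555: A RRRRR GGGGG BBBBB - same, but the spare bit is an alpha/overlay flag. */
static uint16_t pack_argb1555(int a, uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((a ? 1u : 0u) << 15) | ((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
}

/* RGB565  :   RRRRR GGGGGG BBBBB - no spare bit; the extra bit goes to green,
   which is what leileilol attributes the green tinge to. */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}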

Reply 65 of 66, by Jo22

User metadata
Rank l33t++

Green seems to have always been a thing of some sort.
It was considered eye-friendly since the days of glowing clock faces, magic eyes, radar screens and oscilloscopes.
It also was popular in computing in the form of glass terminals and early video monitors (green monitors).
In either case, phosphor had a certain amount of afterglow.

"Green" also has a relevance in video equipment.
There's "sync-on-green" pin, which can give a green video output if feed with a filtered composite video signal,
thus mimicking a green monitor. 😉

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 66 of 66, by The Serpent Rider

User metadata
Rank l33t++

Jo22 wrote:

Green seems to have always been a thing of some sort.

That's because we perceive the green part of the color spectrum better than blue or red. It's an evolution thing.

I must be some kind of standard: the anonymous gangbanger of the 21st century.