VOGONS


First post, by BEEN_Nath_58

User metadata
Rank l33t

It was in the early 2000s that I saw the introduction of the 24-bit color format, after the long era of 16-bit color 3D gaming. But the 24-bit color era was, to be honest, short-lived. Within a few years the 32-bit color format, in other words the ARGB8888 format, took over.

The same happened with desktop color modes: the 24-bit format mostly vanished. All people had with their HW-accelerated display drivers were Low color (8-bit), High color (16-bit) and True color (32-bit) modes.

In modern Windows, the 8-bit and 16-bit modes are done in emulation. The 24-bit mode is not enumerated, but DxWnd can request the 24-bit mode as well, and Windows emulates it like the other modes. Why did the alpha channel become so important that 24-bit color felt like just an idea in a dream?

previously known as Discrete_BOB_058

Reply 1 of 8, by weedeewee

User metadata
Rank l33t

The way I recall it, 24-bit was available several years earlier.
The main reason for it was 16M colors: better photos on your computer.
24 bits because 3 x 8 = 24.
The reasons for going to 32-bit:
memory became cheaper, and the transfer speed of 32-bit vs 24-bit was higher due to alignment, since PCs work in 8, 16, 32, 64, ... bit units (let's not talk about the older and other ones); a sketch of the alignment point follows below;
an alpha channel for transparency functionality.

edit: alpha-channel transparency functions were mostly for high-end cards in those early days. The fourth 8 bits were mostly just discarded.
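
To put the alignment point in concrete terms, here is a rough C sketch (my own illustration, not code from any real driver; the function names are made up):

#include <stdint.h>
#include <stddef.h>

/* Byte offset of pixel (x, y) in a packed 24bpp framebuffer:
   every pixel is 3 bytes, so the offset needs a multiply by 3,
   and pixels regularly straddle 32-bit word boundaries. */
static size_t offset_24bpp(unsigned x, unsigned y, size_t pitch)
{
    return (size_t)y * pitch + (size_t)x * 3u;
}

/* Byte offset in a 32bpp (XRGB8888) framebuffer: 4 bytes per pixel,
   so the offset is a simple shift and every pixel stays dword-aligned. */
static size_t offset_32bpp(unsigned x, unsigned y, size_t pitch)
{
    return (size_t)y * pitch + ((size_t)x << 2);
}

/* Writing one 24bpp pixel takes three separate byte accesses... */
static void put_24bpp(uint8_t *fb, size_t off, uint8_t r, uint8_t g, uint8_t b)
{
    fb[off + 0] = b;
    fb[off + 1] = g;
    fb[off + 2] = r;
}

/* ...while a 32bpp pixel is a single aligned 32-bit store (the X byte stays 0). */
static void put_32bpp(uint8_t *fb, size_t off, uint8_t r, uint8_t g, uint8_t b)
{
    *(uint32_t *)(fb + off) = ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
}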

Last edited by weedeewee on 2023-08-02, 18:04. Edited 1 time in total.

Right to repair is fundamental. You own it, you're allowed to fix it.
How To Ask Questions The Smart Way
Do not ask Why !
https://www.vogonswiki.com/index.php/Serial_port

Reply 2 of 8, by Disruptor

User metadata
Rank Oldbie

Well, memory was expensive in those days.
Transfer speed to the DAC was also an issue.

640x480x24bpp needs 1 MB RAM, x32bpp needs 2 MB
1280x1024x24bpp needs 4 MB RAM, x32bpp needs 8 MB
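
To make the arithmetic behind those card sizes explicit, a quick C sketch (my own, using the resolutions from the post above):

#include <stdio.h>

/* Raw framebuffer size for one screen, and the smallest power-of-two
   card size (1/2/4/8 MB) that can hold it. */
static unsigned long fb_bytes(unsigned w, unsigned h, unsigned bpp)
{
    return (unsigned long)w * h * (bpp / 8);
}

int main(void)
{
    const unsigned modes[][3] = {
        { 640,  480, 24}, { 640,  480, 32},
        {1280, 1024, 24}, {1280, 1024, 32},
    };
    for (unsigned i = 0; i < 4; i++) {
        unsigned long bytes = fb_bytes(modes[i][0], modes[i][1], modes[i][2]);
        unsigned long card = 1;                       /* card size in MB */
        while (card * 1024UL * 1024UL < bytes)
            card *= 2;
        printf("%ux%u %2ubpp: %lu KB -> needs a %lu MB card\n",
               modes[i][0], modes[i][1], modes[i][2], bytes / 1024UL, card);
    }
    return 0;
}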

Reply 3 of 8, by Scali

User metadata
Rank l33t
weedeewee wrote on 2023-08-02, 15:39:

The way I recall it, 24-bit was available several years earlier.
The main reason for it was 16M colors: better photos on your computer.
24 bits because 3 x 8 = 24.
The reasons for going to 32-bit:
memory became cheaper, and the transfer speed of 32-bit vs 24-bit was higher due to alignment, since PCs work in 8, 16, 32, 64, ... bit units (let's not talk about the older and other ones);
an alpha channel for transparency functionality.

edit: alpha-channel transparency functions were mostly for high-end cards in those early days. The fourth 8 bits were mostly just discarded.

Yup, exactly that. Was going to add that usually you use XRGB modes, not ARGB, so the top 8 bits are don't care. But you already added it.
So just going to say: I second this.
In the early days memory was expensive, so you wanted efficient storage. 24-bit was too slow for real-time animation anyway, so you mainly displayed still images.
When memory cost became less of an issue, they could optimize for speed, and for that 32-bit layouts are much better.
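
As an editorial sketch of the XRGB idea in C (names are illustrative, not from the thread): the padding byte carries no data, it just buys aligned 4-byte pixels, so a storage-efficient packed 24bpp buffer can be expanded into the fast 32bpp layout with the top byte simply left as zero.

#include <stdint.h>

/* Expand a packed 24bpp scanline (3 bytes per pixel, storage-efficient)
   into XRGB8888 (4 bytes per pixel, fast to address). The top byte is
   "don't care" and is left as zero. */
static void expand_scanline_24_to_32(const uint8_t *src24, uint32_t *dst32,
                                     unsigned width)
{
    for (unsigned x = 0; x < width; x++) {
        uint8_t b = src24[x * 3 + 0];
        uint8_t g = src24[x * 3 + 1];
        uint8_t r = src24[x * 3 + 2];
        dst32[x] = ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;  /* X = 0 */
    }
}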

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 4 of 8, by Azarien

User metadata
Rank Oldbie

I had a Cirrus Logic card with 24-bit support in early 1995. With just 1 MB it could only do 640x480 in 24-bit mode, with no double buffering, under Windows 3.11.

Scali wrote on 2023-08-02, 18:14:

Yup, exactly that. Was going to add that usually you use XRGB modes, not ARGB, so the top 8 bits are don't care. But you already added it.

I'd say it is XRGB for the framebuffer, and ARGB for textures, as the alpha channel has little use on the screen.

Reply 5 of 8, by leileilol

User metadata
Rank l33t++
BEEN_Nath_58 wrote on 2023-08-02, 15:17:

It was in the early 2000s that I saw the introduction of the 24-bit color format, after the long era of 16-bit color 3D gaming. But the 24-bit color era was, to be honest, short-lived. Within a few years the 32-bit color format, in other words the ARGB8888 format, took over.

I've generally only seen 24-bit color modes on VLB cards. It should have been very dead by 2000, with 32-bit superseding it (for performance reasons, ever since PCI). On PCI I think only the Virge bothered with 24-bit 3D (which, being a Virge, is not a good idea). PowerVR cards render in 24-bit internally regardless of the target mode (this became a Kyro marketing bullet later).

24/32-bit 3D wasn't a new 2000s innovation; it was there earlier, and the dominant 3dfx support bubble plus console ports made it appear as if it never was.

apsosig.png
long live PCem

Reply 6 of 8, by Scali

User metadata
Rank l33t
Azarien wrote on 2023-08-02, 21:06:

I'd say it is XRGB for the framebuffer, and ARGB for textures, as the alpha channel has little use on the screen.

Yes, but in the days when 24-bit and 32-bit first arrived, we didn't have 3D acceleration or textures yet.
You could perform blits with an alpha channel on select hardware, though. But I think the early low-end cards with 32-bit support didn't even have any kind of hardware blitter at all. So if any kind of alpha blending was done on those, it was done by the CPU anyway.
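
For reference, a minimal sketch (mine, in C, with an illustrative function name) of the kind of per-pixel work the CPU would do in that case: blending an ARGB8888 source pixel over an XRGB8888 destination pixel.

#include <stdint.h>

/* Classic "source over destination" blend, done per channel by the CPU:
   dst = (src * a + dst * (255 - a)) / 255, approximated here with >> 8.
   Red and blue are processed together in one 32-bit multiply. */
static uint32_t blend_over_xrgb(uint32_t src_argb, uint32_t dst_xrgb)
{
    uint32_t a  = src_argb >> 24;
    uint32_t na = 255u - a;
    uint32_t rb = ((src_argb & 0x00FF00FFu) * a +
                   (dst_xrgb & 0x00FF00FFu) * na) >> 8;
    uint32_t g  = ((src_argb & 0x0000FF00u) * a +
                   (dst_xrgb & 0x0000FF00u) * na) >> 8;
    return (rb & 0x00FF00FFu) | (g & 0x0000FF00u);   /* X byte ends up 0 */
}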

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 8 of 8, by mkarcher

User metadata
Rank l33t

An interesting consideration that has not yet been raised in the thread is the interface between the VGA chip and the DAC (especially assuming the two aren't integrated). Traditionally this is an 8-bit interface, and all data from video memory is sent sequentially over it (that is also what Disruptor refers to when talking about "transfer speed to the DAC"). An XRGB mode wastes 25% of the available bandwidth between the VGA chip and the DAC if every byte is transferred individually.

Quite soon after 16bpp modes were introduced, DACs were built that could perform double-data-rate transfers in these modes, i.e. transfer 8 bits on the rising edge and 8 bits on the falling edge of the clock. This essentially converts the interface from an 8-bit interface to a 16-bit interface (using only 8 connections, though), and it works perfectly for 16bpp modes. Other vendors produced DACs that had 16 individual data lines. On VRAM-based cards, high-color modes often bypassed the VGA chip altogether and connected the VRAM pixel data output directly to the DAC pixel data input. In the case of 32-bit-wide video memory, the low word and high word were interleaved and sent to the DAC one after the other. Those kinds of DACs often supported a mode in which they could take two 8-bit pixels at the same time, allowing quite high resolutions and/or refresh rates at 256 colors.
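
To put rough numbers on that link, a small C sketch (my own; the 110 MHz link clock is an assumed figure purely for scale, not a quote of any specific RAMDAC):

#include <stdio.h>

/* Maximum dot clock the VGA-to-DAC link can feed, given the link clock,
   transfers per clock (1 = SDR, 2 = DDR), link width in bits, and pixel depth. */
static double max_dotclock_mhz(double link_mhz, int xfers_per_clk,
                               int link_bits, int bpp)
{
    return link_mhz * xfers_per_clk * link_bits / bpp;
}

int main(void)
{
    double link = 110.0;   /* assumed link clock in MHz, just for scale */
    printf("8-bit SDR link, 16bpp : %.0f MHz max dot clock\n",
           max_dotclock_mhz(link, 1, 8, 16));
    printf("8-bit DDR link, 16bpp : %.0f MHz max dot clock\n",
           max_dotclock_mhz(link, 2, 8, 16));
    printf("16-bit link, 2x 8bpp  : %.0f MHz max dot clock\n",
           max_dotclock_mhz(link, 1, 16, 8));
    return 0;
}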

Now introduce 24-bit color into that picture: no RAMDAC I know of is able to re-synchronize 24bpp data that is passed over a 16-bit pixel bus. There are DACs that can take 32bpp as two cycles of 16 bits, but they require 3 cycles of 8 bits to process 24bpp modes. So for these DACs, 24bpp provides less throughput than 32bpp! There are also DACs with a 32-bit interface that can take two 16bpp pixels at once or one 24bpp pixel (ignoring 8 of the 32 input pins, essentially reading the data as XRGB).

And now, for the most fun, introduce the MiroCrystal 24S. That card is targeted specifically at 24-bit color modes and uses a VRAM design where data is passed from VRAM directly to the DAC at a width of 32 bits; 24bpp modes are read by the RAMDAC as XRGB. The funny thing about this card is that while the first megabyte of video memory is correctly equipped with 4 chips of 8 bits each, the second, third and fourth "megabyte" of video memory does not consist of 256 kilowords of 32 bits (which would indeed be a megabyte): miro left out the memory chips that would be mapped to the X bits of the DAC. Those banks only have 256 kilowords of 24 bits, so only 768KB is available in each of the 1MB spaces of the 2nd, 3rd and 4th megabyte. This means the video memory looks like 32bpp to the CPU, but you only pay for 24bpp worth of video memory. It also means that for all modes except the 24bpp modes, only the first megabyte of video RAM is usable. Actually, the design is even more convoluted, so the direct-VRAM-to-DAC mode is not usable for anything except 24bpp modes. Thus the card behaves both like a simple 1MB card with an 8-bit VGA-to-DAC interface and like a 4MB 32bpp card in True Color modes with a high-performance memory interface, while only 3.25MB is soldered to the card.
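
As a worked summary of the throughput point and the MiroCrystal arithmetic (my own sketch in C; the cycle counts follow the description above, and the 135 MHz bus clock is an assumed figure only for scale):

#include <stdio.h>

int main(void)
{
    /* DAC-bus cycles per pixel on a 16-bit-wide interface, per the
       description above: 16bpp fits in one cycle, 32bpp in two 16-bit
       cycles, but 24bpp falls back to three 8-bit cycles. */
    const struct { const char *mode; int cycles; } fmt[] = {
        {"16bpp", 1}, {"32bpp (XRGB)", 2}, {"24bpp (packed)", 3},
    };
    double bus_mhz = 135.0;   /* assumed DAC-bus clock, only for scale */
    for (int i = 0; i < 3; i++)
        printf("%-16s %d cycle(s)/pixel -> %.0f Mpixel/s max\n",
               fmt[i].mode, fmt[i].cycles, bus_mhz / fmt[i].cycles);

    /* MiroCrystal 24S memory: the first megabyte is fully populated;
       the 2nd..4th "megabytes" omit the chips behind the X byte, so
       each holds only 768 KB of real RAM. */
    printf("soldered video RAM: %.2f MB\n", 1.0 + 3 * (768.0 / 1024.0)); /* 3.25 */
    return 0;
}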