VOGONS


Reply 20 of 36, by Chadti99

Rank: Oldbie

Okay, try this. First time dumping a ROM, but some of the ASCII data made me think I got it 🤣. If you have a newer/better ROM than this, let me know; I'd like to try it!

Attachments

  • Filename: CLVGA5429VL20.7z
    File size: 20.62 KiB
    Downloads: 51
    File license: Public domain

Reply 25 of 36, by mkarcher

Rank: l33t

It looks like the dumping process worked, but your VGA card uses the address bits in a non-conventional way. I split your file into two halves of 16K each and interleaved them, obtaining a valid Video ROM. Here is the result.
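If anyone wants to reproduce the fix on a similar dump, here is a minimal sketch of the split-and-interleave step (Python; the file names are placeholders, and depending on which address bit is swapped on your card, the two halves may need to trade places):

    # Rebuild a 32K video ROM from a dump whose two 16K halves hold
    # the even and odd bytes separately. File names are placeholders.
    with open("dump32k.bin", "rb") as f:
        data = f.read()

    half = len(data) // 2
    low, high = data[:half], data[half:]   # the two 16K halves

    fixed = bytearray()
    for a, b in zip(low, high):            # interleave byte-wise
        fixed += bytes((a, b))
    # If the result still looks wrong, swap low and high above.

    with open("videorom.bin", "wb") as f:
        f.write(fixed)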

About the resolutions: Consumer-grade VGA cards (I consider cards like the S3 9xx series professional grade) of that time usually work on 8-bit pixels only. If you run 800x600 at 16-bit color, most parts of the card work as if you were using 1600x600 with 256 colors, and only the last step in the RAMDAC combines two virtual 8-bit pixels into one 16-bit pixel. This means the card runs at twice the dot clock you need for that mode. In true-color modes, the card runs at thrice the desired dot clock. This limits the dot clock to half of the specified maximum clock in 16-bit modes, and a third in 24-bit modes. The original IBM VGA even worked on 4-bit pixels, and already used a "pixel merging" scheme in the 256-color mode. That's why the card can do 640/720 pixels in text mode and 16-color graphics modes, but only 320/360 pixels in 256-color mode.

For your amusement: Trident carried on with 4-bit pixels for a long time, and early true-color-capable Trident cards run at six times the dot clock required for that mode. This is the reason why some Trident cards have true-color modes at 28 kHz or 29 kHz HSync: they can't go high enough to reach standard VGA frequencies (31.5 kHz HSync).
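To put numbers on those multipliers, a quick sketch (the 40 MHz figure is the standard 800x600 @ 60 Hz pixel clock; the function name is just illustrative):

    # Internal dot clock needed when a card pushes pixels through a
    # narrow internal path: transfers per pixel = bpp / path width.
    PIXEL_CLOCK_MHZ = 40.0   # standard 800x600 @ 60 Hz pixel clock

    def internal_clock(pixel_clock_mhz, bpp, path_bits=8):
        transfers = -(-bpp // path_bits)   # ceiling division
        return pixel_clock_mhz * transfers

    for bpp in (8, 16, 24):
        print(f"{bpp} bpp on 8-bit path: "
              f"{internal_clock(PIXEL_CLOCK_MHZ, bpp):.0f} MHz")
    # The Trident case: a 4-bit path needs 6 transfers per 24 bpp pixel.
    print(f"24 bpp on 4-bit path: "
          f"{internal_clock(PIXEL_CLOCK_MHZ, 24, 4):.0f} MHz")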

Attachments

  • Filename: CLVGA5429VL20b.7z
    File size: 17.48 KiB
    Downloads: 62
    File comment: Cirrus Logic CL-GD5429 BIOS, non-interleaved
    File license: Fair use/fair dealing exception

Reply 28 of 36, by Disruptor

Rank: Oldbie

mkarcher is examining the card in detail now, since we found a 1-byte difference in the BIOS (another byte differs in the checksum, though).
It uses the same scheme with interleaved bytes in the raw BIOS.
My card has a memory timing of 60 MHz. The differing byte in your card defines a timing of 50 MHz.
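For reference, the checksum byte follows from the standard PC option ROM rule: all bytes of the image must sum to zero mod 256, so any one-byte change forces a compensating change in the checksum byte. A minimal fix-up sketch, assuming the checksum sits in the last byte of the image (the usual convention):

    # Patch a PC option ROM so all bytes sum to 0 mod 256.
    def fix_checksum(rom: bytes) -> bytearray:
        patched = bytearray(rom)
        patched[-1] = (-sum(patched[:-1])) & 0xFF
        return patched

    with open("videorom.bin", "rb") as f:
        rom = fix_checksum(f.read())
    print(f"checksum byte: {rom[-1]:#04x}, "
          f"total mod 256: {sum(rom) & 0xFF}")   # should print 0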

Reply 29 of 36, by mkarcher

Rank: l33t
Chadti99 wrote on 2021-10-06, 19:30:

Okay, try this. First time dumping a ROM, but some of the ASCII data made me think I got it 🤣. If you have a newer/better ROM than this, let me know; I'd like to try it!

I compared your ROM with the ROM on Disruptor's card. It turns out that the ROM code is identical, but the card initialization data is slightly different. Disruptor's card initializes the memory clock to 60 MHz (actually 60.852 MHz, the maximum mentioned in the datasheet), whereas your ROM initializes the memory clock to 50 MHz (actually 50.114 MHz, the highest value generally recommended by Cirrus Logic, but it seems the 5429 got some kind of blessing for operation up to 60 MHz).
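As a side note, both odd-looking frequencies fall exactly on the usual CL-GD542x memory clock grid, where MCLK is programmed as a 6-bit multiple N of 14.31818 MHz / 8 (the MCLK register, SR1F). Take the register detail as my reading of the datasheet, not something verified on this specific card:

    # CL-GD542x memory clock grid: MCLK = N * 14.31818 MHz / 8
    STEP_MHZ = 14.31818 / 8   # ~1.78977 MHz per register step

    print(f"N=28: {28 * STEP_MHZ:.3f} MHz")  # 50.114 MHz (Chadti99's ROM)
    print(f"N=34: {34 * STEP_MHZ:.3f} MHz")  # 60.852 MHz (Disruptor's ROM)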

I measured the performance of byte writes and word/dword writes in different modes (text mode, 16-color VGA mode, 256-color VGA mode, and the 800x600 256-color SVGA mode) at different memory clocks and a 40 MHz VLB clock. The results are shown in the diagram below. There is a single line for "VGA byte writes" because byte write performance is identical in the VGA 16-color modes and the VGA 256-color mode. The BIOS refused to initialize 800x600 with 256 colors at memory clock settings below 44 MHz. It seems that starting at a 57 MHz memory clock, performance gains become marginal, but the jump in word write performance in 16-color modes is impressive. The vertical lines mark the settings of 50 MHz and 60 MHz as initialized by the different ROMs.

  • Filename: CL-GD5429-performance.PNG
    File size: 71.75 KiB
    Views: 657
    File license: Public domain

Reply 33 of 36, by mkarcher

Rank: l33t
Chadti99 wrote on 2021-10-10, 19:15:

Interesting, should I try Disruptor’s BIOS to try and squeeze an extra frame or two out of my card?

If you want to try running the video RAM at higher speeds than provided by your BIOS by default, you can try the generic Cirrus memory clock adjustment tool I just coded and posted about in Tool to adjust the memory clock on Cirrus Logic CL-GD542x graphics cards. The changes made by this tool persist until overwritten by another tool (IIRC Cirrus at some point provided MCLK40 and MCLK44 to set 40 and 44 MHz), or until the system is reset. You can invoke the tool from autoexec.bat to select a higher clock if you want. As Disruptor already mentioned, 50 MHz is the highest speed that operates typical 70 ns RAM within specification, but as long as your RAM stays cooler than the maximum permitted temperature (usually 70°C), it is likely to respond faster than guaranteed, and might work well enough at 52 or 54 MHz.
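If you want to see which register steps land near those speeds before trying them (same N × 14.31818/8 MHz assumption as in the sketch above):

    STEP_MHZ = 14.31818 / 8

    for n in range(28, 32):
        print(f"N={n}: {n * STEP_MHZ:.3f} MHz")
    # N=29 -> 51.903 MHz and N=30 -> 53.693 MHz are the steps
    # closest to the 52 and 54 MHz mentioned above.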

If you desire, I can provide you with a patched BIOS image with a higher memory clock, but I highly recommend testing with this non-persistent tool before re-programming an EPROM chip.

Reply 34 of 36, by maxtherabbit

Rank: l33t
mkarcher wrote on 2021-10-07, 05:56:

About the resolutions: Consumer-grade VGA cards (I consider cards like the S3 9xx series professional grade) of that time usually work on 8-bit pixels only. If you run 800x600 at 16-bit color, most parts of the card work as if you were using 1600x600 with 256 colors, and only the last step in the RAMDAC combines two virtual 8-bit pixels into one 16-bit pixel. This means the card runs at twice the dot clock you need for that mode. In true-color modes, the card runs at thrice the desired dot clock. This limits the dot clock to half of the specified maximum clock in 16-bit modes, and a third in 24-bit modes. The original IBM VGA even worked on 4-bit pixels, and already used a "pixel merging" scheme in the 256-color mode. That's why the card can do 640/720 pixels in text mode and 16-color graphics modes, but only 320/360 pixels in 256-color mode.

That's new to me. Was this still a thing in early PCI cards too?

Reply 35 of 36, by mkarcher

Rank: l33t
maxtherabbit wrote on 2021-10-12, 03:02:

That's new to me. Was this still a thing in early PCI cards too?

I don't know of any PCI card that has an internal 4-bit path, and I highly doubt one exists. The basic fact that you need twice the memory bandwidth in 16-bit modes compared to 8-bit modes is of course still valid in any graphics card up to today. On the other hand, starting with integrated RAMDACs, the 8-bit path between the graphics chip and the RAMDAC is easily expandable to higher widths. Even before integration of the RAMDAC, some chips like the ET4000 had an optional mode to run the RAMDAC interface at double data rate to overcome the issue. High-end graphics cards often used VRAM and passed the data (in SVGA modes) from the VRAM to the RAMDAC, avoiding the graphics chip altogether (that's the main point of the S3 9xx series!). The RAMDAC data path is often 32 bits wide. I own an S3 928 graphics card that uses bank interleaving, providing a virtual 64-bit path by alternating between two VRAM banks for even higher bandwidth.

As a direct answer to your question: I suppose (but I don't know) that S3 805 cards (even the PCI variants with the 805p) have an 8-bit interface between the S3 chip and the RAMDAC.
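To put rough numbers on the bandwidth argument, a quick sketch for 800x600 @ 60 Hz; the 40 MHz pixel clock is the standard timing for that mode, and the path widths are the ones discussed above:

    # Scanout data rate the memory-to-RAMDAC path must sustain.
    # Blanking is already included in the pixel clock.
    PIXEL_CLOCK_MHZ = 40.0   # 800x600 @ 60 Hz

    for bpp in (8, 16, 24):
        mb_per_s = PIXEL_CLOCK_MHZ * bpp / 8
        print(f"{bpp} bpp: {mb_per_s:.0f} MB/s "
              f"({mb_per_s:.0f} MT/s on an 8-bit path, "
              f"{mb_per_s / 4:.0f} MT/s on a 32-bit path)")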

Reply 36 of 36, by mkarcher

Rank: l33t
maxtherabbit wrote on 2021-10-12, 03:02:

That's new to me. Was this still a thing in early PCI cards too?

I just probed the output of the DAC/clock synthesizer on a SPEA V7 MIRAGE VL (S3 805p). In text mode, the clock outputs are 50 MHz (memory clock) and 28.3 MHz (pixel clock). In 640x480 24bpp @ 60 Hz, the clock outputs are 50 MHz (memory clock) and 75 MHz (pixel clock). The actual pixel clock is 25 MHz at that resolution, so triple clock for true-color modes definitely is a thing on that card. I know it's not PCI, but the 805p is PCI-capable and won't behave differently on PCI cards.
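As a quick sanity check on those numbers (25.175 MHz is the standard VGA 640x480 pixel clock):

    # A 24 bpp mode on an 8-bit-pixel chip needs three RAMDAC
    # transfers per pixel.
    print(25.175 * 3)   # ~75.5 MHz, matching the ~75 MHz measured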