noshutdown wrote on 2025-01-25, 11:18:
we know that most early vga cards have multiple crystals for different resolutions, the most common ones being 25.175MHz for the 640*480 60Hz vga mode and 28.322MHz for text modes, and there can be more crystals for extra modes or the ram clock.
an 800*600 60Hz mode requires about 40MHz, while 1024*768 60Hz needs ~65MHz. does that mean cards without a 40MHz crystal would not support 800*600 mode? well, we don't see 65MHz crystals very often because most cards capable of 1024*768 were already using an adjustable or integrated clockgen.
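The clock figures above can be sanity-checked with simple arithmetic: pixel clock = total pixels per frame (active plus blanking) times refresh rate. A small sketch, using the standard VESA frame totals for these modes:

```python
# Rough pixel-clock check: total pixels per frame (active + blanking)
# multiplied by the refresh rate. The h/v totals are the standard
# VESA timings for each mode, not values from the post itself.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

modes = {
    "640x480@60":  (800, 525, 60),    # ~25.2 MHz (25.175 MHz crystal)
    "800x600@60":  (1056, 628, 60),   # ~39.8 MHz (40 MHz crystal)
    "1024x768@60": (1344, 806, 60),   # ~65.0 MHz
}
for name, (ht, vt, hz) in modes.items():
    print(f"{name}: {pixel_clock_mhz(ht, vt, hz):.1f} MHz")
```

This is why the 40MHz and ~65MHz figures fall out the way they do: blanking adds roughly 30% on top of the visible pixel count.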
Typically, the very early SVGA cards had a 36MHz crystal for 800x600 at 56Hz, which uses a 35.2kHz hsync, close enough to the 35.5kHz hsync of 1024x768 at 43.5Hz/87Hz interlaced. The interlaced 1024x768 mode uses a 44.9MHz pixel clock. Anything more versatile than that usually has clock synthesizer chips instead of dedicated crystals. A 40MHz crystal might be for 800x600 at 60Hz as you suspect, but it was also a typical chip/memory clock crystal.
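The "sufficiently similar hsync" claim is easy to verify: hsync is just pixel clock divided by total horizontal pixels. A quick sketch, assuming the standard horizontal totals (1024 for 800x600@56, 1264 for 8514/A-style interlaced 1024x768):

```python
# Hsync = pixel clock / horizontal total (active + blanking).
# The h_total values are the standard timings, assumed here.
def hsync_khz(pixclk_mhz, h_total):
    return pixclk_mhz * 1e3 / h_total

print(f"{hsync_khz(36.0, 1024):.1f} kHz")   # 800x600@56  -> 35.2 kHz
print(f"{hsync_khz(44.9, 1264):.1f} kHz")   # 1024x768@87i -> 35.5 kHz
```

Both modes land within ~0.4kHz of each other, which is why a monitor built for one could generally sync to the other.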
The original VGA design operated the whole card at a clock derived from the pixel clock and didn't pre-fetch any video memory data. The first generation of Super VGA cards was based on the same design, just running at higher clocks, thus requiring even faster memory at the full 32-bit bus width used by the VGA.

Later VGA designs, on the other hand, got an entirely different memory interface that finally used fast page mode for burst reads and writes: they read something like 8 32-bit words at once into a display FIFO, leaving the time between FIFO refills for PC writes, which were buffered in a write FIFO and, in case of page hits, also drained using fast page mode. As you can see, the two most critical operations, CPU writes to video memory and display reads from video memory, are now asynchronous and decoupled by FIFOs/queues, so the memory clock can be independent of the pixel clock. That allows fast video memory writes at high memory clocks even in low-resolution modes, or lets the improved efficiency of the memory interface be used to drop the memory bus width down to 8 or 16 bits without being slower than the original VGA design on video memory writes (which, admittedly, doesn't mean much).

40MHz was a very widespread clock for the memory, also called the "system clock" by some manufacturers, and is basically universally used on ET4000 graphics cards. IIRC there were some clock synth chips for VGA cards that use 40MHz as the reference clock instead of the standard 14.318MHz reference, because "you have 40MHz anyway".
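A back-of-the-envelope sketch of why the burst-fetch/FIFO scheme frees the bus for CPU writes. All the cycle counts here are hypothetical illustration figures, not measurements of any particular chip:

```python
# Hypothetical sketch: fraction of memory cycles the display engine
# occupies when it fetches bursts of 8 x 32-bit words into its FIFO.
# page_miss_cycles (first slow access per burst) is an assumed figure.
def display_bus_share(pixclk_mhz, bpp, memclk_mhz, bus_bits,
                      burst_words=8, page_miss_cycles=4):
    bytes_per_sec = pixclk_mhz * 1e6 * bpp / 8        # display data rate
    words_per_sec = bytes_per_sec / (bus_bits / 8)    # memory words needed
    bursts_per_sec = words_per_sec / burst_words
    # each burst: one slow first access, then fast-page-mode accesses
    cycles_per_burst = page_miss_cycles + (burst_words - 1)
    busy_cycles = bursts_per_sec * cycles_per_burst
    return busy_cycles / (memclk_mhz * 1e6)           # fraction of cycles

# 640x480 at 8bpp on a 40 MHz, 32-bit memory: display refresh takes
# only a fraction of the bus; the rest drains the CPU write FIFO.
print(f"{display_bus_share(25.175, 8, 40, 32):.0%}")
```

Under these assumed timings the display fetch occupies only around a fifth of the memory cycles, which is the whole point of decoupling the two sides with FIFOs.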