superfury wrote on 2021-06-29, 16:14:
It didn't seem to mention anything I didn't already implement?
Edit: Just fixed an issue with the EGA 64 color palette (incorrect 1/3 and 2/3 shades of R/G/B).
Edit: Although the DAC seems fine (looking at the colors after a dump), the 16-color palette in the attribute controller doesn't seem correct? It's programmed with 0-7, followed by 10h-17h? Isn't that supposed to be at the end of the 64-color range (37h-3Fh)?
That might be caused by your DIP switch settings. You mentioned that you used DIP configuration 01 (or 0Eh inverted). This is "primary card MDA, secondary card EGA with color monitor at 80x25". If no MDA is installed, this configuration automatically falls back to "EGA with color monitor". The important point here is "color monitor": it refers to the IBM 5153 CGA monitor, not the IBM 5154 EGA monitor. The latter would be called an "enhanced monitor", and you should use DIP configuration 03 (or 0Ch inverted) to drive it in 350-line high-resolution text mode.
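A minimal sketch of that inversion, assuming the configuration switches are read back as an inverted 4-bit nibble as described above (the helper name `switch_readback` is hypothetical):

```python
def switch_readback(config: int) -> int:
    """Return the inverted 4-bit value seen when reading back a DIP configuration."""
    return ~config & 0x0F

# Configuration 01 reads back as 0Eh, configuration 03 as 0Ch.
print(hex(switch_readback(0x1)))  # 0xe
print(hex(switch_readback(0x3)))  # 0xc
```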
The IBM 5153 is a 16-color monitor, and the EGA color bits are mapped to the monitor pins in such a way that 0-7 followed by 10h-17h outputs the sixteen bit patterns the 5153 recognizes (i.e. the I pin of the IBM 5153 is driven by the EGA card's secondary green signal). In "mode 1" (200 scanlines), the EGA monitor behaves CGA-like and falls back to a 16-color mode; it behaves virtually identically to the IBM 5153. Only in "mode 2" (350 scanlines) is the power of 64 colors unleashed. To get the monitor into "mode 2", you need to set a high-resolution graphics mode (0Fh/10h) or a 350-scanline text mode.
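That mapping amounts to moving the CGA intensity bit into the EGA's secondary-green bit (bit 4), which is why the 16-color palette is programmed as 0-7 followed by 10h-17h. A sketch, with a hypothetical helper name:

```python
def irgb_to_ega_palette(irgb: int) -> int:
    """Map a 4-bit CGA IRGB color (I in bit 3) to the 6-bit EGA palette value
    whose secondary-green signal (bit 4) drives the IBM 5153's I pin."""
    rgb = irgb & 0x07            # primary R, G, B stay in bits 2..0
    intensity = (irgb >> 3) & 1  # I bit moves to secondary green
    return rgb | (intensity << 4)

# Produces 0..7 followed by 10h..17h for IRGB values 0..15.
print([hex(irgb_to_ega_palette(i)) for i in range(16)])
```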
In mode 1, the color values you observe are exactly what is expected. The magic brown correction (brown is #AA5500, although the RGBI pattern decodes naively to #AAAA00) is performed by the monitor. In mode 2, the palette values are 0-5, 14h, 7, 38h-3Fh. The magic brown correction is then done by outputting the correct rgbRGB pattern for #AA5500 (primary red, secondary green, no blue).
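As a sketch of the mode-2 case, converting a 6-bit rgbRGB value to 24-bit RGB (primary bits weighted 2/3, secondary bits 1/3) shows how 14h yields #AA5500 directly, with no monitor-side correction needed. The helper name and bit order (bits 5..0 = r'g'b'RGB) are my assumptions:

```python
def ega6_to_rgb(v: int) -> tuple:
    """Convert a 6-bit EGA color (bits 5..0 = r'g'b'RGB) to 24-bit RGB.
    Primary bits contribute 0xAA (2/3 shade), secondary bits 0x55 (1/3 shade)."""
    r = ((v >> 2) & 1) * 0xAA + ((v >> 5) & 1) * 0x55
    g = ((v >> 1) & 1) * 0xAA + ((v >> 4) & 1) * 0x55
    b = ((v >> 0) & 1) * 0xAA + ((v >> 3) & 1) * 0x55
    return (r, g, b)

# 14h = primary red + secondary green -> brown
print('#%02X%02X%02X' % ega6_to_rgb(0x14))  # #AA5500
```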