Reply 380 of 457, by NewRisingSun
CGA monitors include my PCjrs and Tandys and even an AT&T D25 double-scanrate CGA monitor, and they *all* show brown for #6. And that's the way I've always remembered it for two decades.
You're missing the point. I'm not doubting that your current 5153 are all brown. I am doubting that *all* 5153s in existence are brown. You can show me a hundred 5153s that are all brown; it still does not address the point whether there are some (early) 5153s that are yellow.
We are seeing a picture in this very thread that shows a monitor that is not brown, and I am rejecting the cheap explanation that the photo is rigged/badly done, or that the monitor is so misadjusted that red and green would display correctly but brown wouldn't. I am also doubtful of the explanation that it's the intense colors we're seeing, because I have never seen the game display the high score screen in intense colors. Also, compared with the room lighting, the colors look too dark to be the intense ones.
So, the facts are: we are seeing a picture of a 5153 that is yellow. Most 5153s are brown. How can this be explained?
If you have followed the discussion so far, the hypothesis has been this: the original 5153 was yellow, but when the EGA was introduced, it was modified to display brown instead. You mentioned that you got into PCs in 1985, which is well after the introduction of the EGA, so any 5153 you might have bought would have been the "revised" model, if there was one. Do you understand what I'm saying? You don't have to agree with it, just tell me if you get my point.
Why would the introduction of EGA be of such importance? Because the CGA's RGBI color model does not allow for brown. Bit 0 sets B to AA, bit 1 sets G to AA, bit 2 sets R to AA, bit 3 adds 55 to R, G and B. Color "6" has bits 1 and 2 set, therefore R and G have a value of AA, therefore the color is #AAAA00. Brown would be #AA5500, which is not possible in the RGBI color model. Any RGBI monitor displaying brown MUST have some kind of "hack" to detect color 6 and artificially lower the green component.
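To make the arithmetic concrete, here is a little sketch of the RGBI decoding just described, with the "brown hack" as an optional flag (the function name and flag are mine, purely for illustration, not anything in the actual monitor circuitry):

```python
def rgbi_to_rgb(color, brown_hack=False):
    """Decode a 4-bit RGBI attribute (0-15) to an (R, G, B) tuple."""
    r = 0xAA if color & 0x04 else 0x00  # bit 2: R to AA
    g = 0xAA if color & 0x02 else 0x00  # bit 1: G to AA
    b = 0xAA if color & 0x01 else 0x00  # bit 0: B to AA
    if color & 0x08:                    # bit 3: add 55 to R, G, B
        r += 0x55
        g += 0x55
        b += 0x55
    if brown_hack and color == 6:
        g = 0x55  # detect color 6 and lower green: #AAAA00 -> #AA5500
    return (r, g, b)

# Color 6 by the plain RGBI rules is dark yellow, #AAAA00:
print(rgbi_to_rgb(6))              # -> (170, 170, 0)
# With the hypothesized hack, it becomes brown, #AA5500:
print(rgbi_to_rgb(6, brown_hack=True))  # -> (170, 85, 0)
```

Note that every other color comes out the same either way; only attribute 6 is special-cased, which is exactly why it takes dedicated circuitry to produce brown on an RGBI monitor.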
The EGA, on the other hand, uses the 6-bit ECD color model. Bit 0 sets B to AA, bit 1 sets G to AA, bit 2 sets R to AA. Bit 3 sets B to 55, bit 4 sets G to 55, bit 5 sets R to 55. Brown, being #AA5500, therefore needs to have bits 2 and 4 set, which results in the value 0x14, which is indeed the default palette entry for attribute 6.
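The same kind of sketch for the 6-bit ECD decoding (again, the function name is mine, just to show the bit arithmetic):

```python
def ecd_to_rgb(value):
    """Decode a 6-bit ECD palette value (0-63) to an (R, G, B) tuple."""
    r = (0xAA if value & 0x04 else 0) + (0x55 if value & 0x20 else 0)  # bits 2, 5
    g = (0xAA if value & 0x02 else 0) + (0x55 if value & 0x10 else 0)  # bits 1, 4
    b = (0xAA if value & 0x01 else 0) + (0x55 if value & 0x08 else 0)  # bits 0, 3

    return (r, g, b)

# 0x14 has bits 2 and 4 set: R = AA, G = 55, giving brown, #AA5500:
print(ecd_to_rgb(0x14))  # -> (170, 85, 0)
```

No special case is needed here: brown is simply one of the 64 values the model can express directly.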
In other words, while the RGBI color model does not allow for brown, the EGA color model does. My hypothesis is that when the EGA came about and people used it with the 5153 instead of the more expensive 5154, IBM added a "hack" to the 5153 so that there would be no color difference between the two monitors.
Of course, it could also be that the "hack" was there from the start, and the EGA finally had a color model that could properly reflect that RGBI-violating "hack". I reject this variant because:
- we are seeing a picture of a display that does not have this hack (see above on why I think the picture displays what is claimed);
- the CGA's composite output undoubtedly displays color 6 as yellow, NOT brown. Why would the composite output follow the RGB output in all colors but one?