Reply 20 of 37, by mkarcher
VileR wrote on 2020-09-30, 13:02:
(Continuing the video-related discussion. Clearly off-topic by now; I'd ask a moderator to split it, but I know it's an annoying bunch of work). 😀
Inspired by our discussion, I photo-documented the tear-down of one of the PS/2 model 30 computers I had at hand at the time. The access window is closed now, but I will get back to the machines in a few months. I photographed every component from every interesting angle, especially the IBM part numbers on them (though I wonder whether I forgot to shoot detail photos of the ISA riser stabilisation bracket). I took photos of the mainboard and stitched them together. Even though there are visible stitching artifacts in one place, the result can be used for tracing out a lot of stuff (of course with no info about the traces below the chips). I didn't have a good continuity beeper at hand, so there's no detailed trace-out of how the chips interoperate on the system yet.
I can tell you, though, that the R/G/B output pins on the VGA connector are directly connected to the IMSG171 RAMDAC and to 150-ohm pull-down resistors, so there are clearly no TTL-level signals on those pins. The R/G/B return pins are connected to the ground pin of the RAMDAC with separate traces; the ground pin of that chip is a star grounding point for the analog video signals. IBM spared no expense there. It is very unlikely that you'd find a digital RGBI signal on the VGA port: 6 pins for RGB plus returns, 2 pins for sync and one pin for digital ground already occupy 9 of the 14 usable pins (one of the 15 is a key pin). I counted the traces to the VGA port, and there are 11, so there is no way to fit digital RGB anywhere on that connector.
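The pin budget above can be tallied in a few lines (a sketch; the pin roles follow the standard analog VGA pinout, which this board appears to implement, and the trace count is from my board survey):

```python
# Rough pin budget for an analog VGA connector (DE-15, one pin used as a key).
# Pin roles follow the standard VGA pinout; the trace count is from the board.
analog_pins = {
    "R": 1, "G": 1, "B": 1,                       # analog color, 150-ohm pull-downs
    "R return": 1, "G return": 1, "B return": 1,  # separate ground returns
    "HSYNC": 1, "VSYNC": 1,                       # TTL-level sync
    "digital ground": 1,
}
usable_pins = 15 - 1  # 15-pin shell minus the key pin
print(sum(analog_pins.values()), "of", usable_pins, "usable pins")  # 9 of 14

# A digital RGBI interface would need R, G, B, I, HSYNC, VSYNC and ground on
# top of that -- more signals than the 11 traces found on the board could carry.
```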
After reassembly, the system obviously POSTed (a floppy drive seek was audible, followed by a double beep, either for "keyboard missing" or "RTC battery flat"), but I had no time to clean and inspect the 8512 monitor. The modern Super-VGA monitor I had at hand didn't fit the keyed VGA socket, so no pictures yet. BTW: I quite like the ASCII-art pictures for "unknown time" and "insert floppy and press F1 to continue" shown by the PS/2 model 30 computers. A final side note (grinding towards the thread topic again): the power supply of the PS/2 model 30 I tore down (which is distinctly different from the power supply in the "executive workstation", starting with the form factor) uses hex screws without a security pin. The 2mm bit is too small and the 3mm bit is too big, so the screws are either 2.5mm or imperial (like 3/32"). I could unscrew them by abusing a T10 bit, though. I still don't have pictures of the inside of the power supply (although I would have liked some, as I remember seeing a label "Warning! Heat sink is live!" through the ventilation gratings), because in addition to the three hex screws, the halves of the power supply are also joined with rivets. For the relief of the alarmed reader: there was no Dremel in reach.
Do you think the internet values a well-illustrated Model 30 teardown, or would I be wasting my time making a nice illustrated teardown/reassembly guide for the PS/2 models? I haven't researched it, but hardware maintenance manuals containing service instructions and part numbers might be available, just as they are (were?) for the classic ThinkPads, for example.
VileR wrote on 2020-09-30, 13:02:mkarcher wrote on 2020-09-29, 08:46:
A consequence of the 16-bit design of the Hercules card is that the CRTC of the Hercules card runs at a character clock that is the dot clock divided by 16; essentially it produces 45 "characters", each 16 pixels wide, to get to 720 pixels per line. On the other hand, the CGA card in monochrome graphics mode runs the CRTC at the dot clock divided by 8, so it is programmed to produce 80 "characters", each 8 pixels wide, to obtain 640 pixels per line. The 320-pixel mode runs the CRTC at the same timing; it just "merges" two 1-bit pixels into one 2-bit pixel in the output circuit (more in depth, it's basically the other way around: the 640 mode splits two-bit packets into two single pixels).
Appreciate the info - I never looked too deeply into the Hercules monographics stuff, but I did notice those 16-pixel wide "characters" when I had to find out something or other related to timing. So that explains why they went for that odd-looking scheme (and also, I think, why the H/V refresh rates for Hercules are subtly different from the MDA ones... I suppose they were lucky that 5151s and the like *did* have a bit of tolerance after all).
The Hercules CRTC setup in text mode is identical to the MDA CRTC setup, and thus the pixels per scanline and the scanlines per frame are identical in MDA text mode and Hercules text mode. But Hercules opted for the more widely available 16.000MHz quartz oscillators as clock source, whereas IBM for some reason (if anyone knows, please speak up!) went with a 16.257MHz oscillator.
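The effect of that oscillator choice on the refresh rates can be worked out directly (a sketch; the CRTC totals of 98 character columns of 9 dots and 370 total scanlines are the commonly cited MDA text-mode values, used here as assumptions):

```python
# Derive H/V refresh rates from the dot clock and the (identical) CRTC setup.
# Assumed MDA totals: 98 chars/line * 9 dots = 882 dots per scanline,
# 370 scanlines per frame (350 of them visible).
DOTS_PER_LINE = 98 * 9    # 882
LINES_PER_FRAME = 370

for name, dot_clock in [("MDA", 16_257_000), ("Hercules", 16_000_000)]:
    h = dot_clock / DOTS_PER_LINE       # horizontal scan rate in Hz
    v = h / LINES_PER_FRAME             # vertical refresh in Hz
    print(f"{name}: {h/1000:.3f} kHz horizontal, {v:.2f} Hz vertical")
```

With these assumed totals, MDA lands at roughly 18.43 kHz / 49.8 Hz and Hercules at about 18.14 kHz / 49.0 Hz: subtly different, but evidently still within what a 5151-class monitor tolerates.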
VileR wrote on 2020-09-30, 13:02:
About CGA, that makes sense, since after all the data rate for the 320/640-pixel graphics modes is the same. The clock need only be doubled for 80-column text mode, so that two bytes (character + attributes) are transferred for each character period.
As you may be aware, with that doubled clock rate, other timings are not necessarily adjusted to compensate - the hsync pulse width for one is effectively halved, unless manually reprogrammed. For the composite output, this means it no longer fully overlaps with the NTSC color burst... which is effectively ANDed with the hsync pulse. This single fact caused no end of headache with the color output for 8088 MPH, and is the reason for the 'calibration' screen you get at the beginning.
I'm still wondering whether that was a bug in the CGA design, or a feature....
My understanding of the CGA hardware differs from my understanding of your text. As far as I know, the CRTC character clock (the CRTC knows nothing about horizontal pixels) runs at full rate in all modes except the 40-column text mode. The memory cycle time is the same in all modes, whereas the bandwidth requirement is low in 40-column text mode as well as in graphics mode, but high in 80-column text mode. The ISA bus can use every second memory cycle (let's call them the odd cycles). In the low-bandwidth modes, the CRTC just uses the even cycles, so the CRTC and the ISA bus don't interfere. In high-bandwidth mode, the CRTC needs every cycle during the display period, but the ISA bus has "priority" on the memory. An ISA cycle pending at an odd cycle preempts the CRTC address on the memory address lines, so the scan-out logic gets to see the data byte that was read or written by the processor instead of the data byte requested by the CRTC, and that's the snow you can get.
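The arbitration described above can be illustrated with a toy model (hypothetical code, not IBM's actual logic; it only shows why a CPU access during the display period corrupts the scan-out in the high-bandwidth mode):

```python
# Toy model of CGA memory-cycle arbitration in high-bandwidth (80-column) mode.
# Every cycle belongs to the CRTC during the display period, but a pending
# ISA/CPU access wins the odd cycles -- the scan-out logic then latches the
# CPU's byte instead of the CRTC's fetch, which appears on screen as "snow".

def scan_out(cycles, cpu_pending_at):
    """Return what the scan-out logic sees on each memory cycle."""
    seen = []
    for cycle in range(cycles):
        if cycle % 2 == 1 and cycle in cpu_pending_at:
            seen.append("CPU byte (snow)")  # CPU preempted the CRTC address
        else:
            seen.append("CRTC byte")        # normal fetch for display
    return seen

result = scan_out(8, cpu_pending_at={3, 5})
print(result)
# In the low-bandwidth modes the CRTC only uses the even cycles, so the same
# CPU accesses on odd cycles would not disturb the display at all.
```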
As the sync pulse width is derived from the CRTC character clock and not the pixel clock, the overly short sync pulses should occur in all modes except modes 0/1, of which only mode 1 needs a working color burst pulse. I think IBM didn't care about a broken color burst in mode 3, because 80-column text on composite output is a bad idea anyway. Modes 5/6 are without color burst anyway, but for mode 4 a working NTSC color burst is very much intended, and it seems clearly like a bug that this mode has the broken short pulses.
VileR wrote on 2020-09-30, 13:02:
Now about the font loading: It differs from the EGA/VGA model in nearly every possible aspect. That's not too surprising in the end, as the EGA/VGA model is optimized for video display using the font, whereas the MCGA model is optimized for being transferred by the font loading DMA engine. Font loading seems to work like this:
Woah. Very well, I'm now fully convinced of the weirdness of MCGA. 😀
Since it's clearly a specific and optimized design on its own, rather than just a simple budget-version VGA (or an extended CGA), it's a bit puzzling that IBM expended all this engineering effort on a device intended for the budget low-end models... ones which were otherwise almost-obsolete on introduction!
I guess they wanted to push the XT clones out of the low-end market by providing an entry-level system with increased performance (8086 instead of 8088) and improved graphics (256 colors! 640x480!) as a competitor. This would never have taken off (indeed, did it really take off?) if it hadn't been software compatible with some already established graphics solution (VGA in this case); on the other hand, they couldn't fit the whole VGA thing on the mainboard. As you might be aware, the "chipset" of the PS/2 model 30 does not consist of classic ASICs, but of mask-programmed gate arrays (two of them seem to make up the MCGA). The space in those chips is limited, so VGA might have needed more chips (imagine the complexity of the graphics controller!). And it would have needed space for 8 RAM chips on the board.
VileR wrote on 2020-09-30, 13:02:
In the interests of research, this is the one I was referring to - ps2m30-30F9579_80.zip
There's also a similarly sized set for the Model 35 (even/odd, 64K each). For the 30, I do have sets for 4 different revisions, though only at the more common 32K-per-chip.... and I can't guarantee that they're all good dumps.
The only two PS/2 model 30 ROM sets I can access right now are already on the internet, so no news on that front.
Actually, even though I have had the systems in the house for 25 years, it never occurred to me that they were 8086-based instead of 8088-based. The 8-bit expansion bus made me believe they were just highly integrated XTs. I only realized they had a 16-bit processor after wondering why on earth they would have an odd/even split of the BIOS in an 8-bit computer; I would have expected the two BIOS chips to be low 32K and high 32K instead. I will definitely take a look at the extended PS/2 ROM. Maybe they wanted the fonts for some WYSIWYG application (with serif and sans-serif fonts being displayable at the same time).
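For anyone who wants to combine such even/odd dumps into one linear image, the interleaving is straightforward (a generic sketch; the file names are placeholders, not the actual dump names):

```python
# Interleave an even/odd ROM pair (as used on a 16-bit 8086 data bus) into a
# single linear image: byte 0 from the even chip, byte 1 from the odd chip, ...

def interleave(even: bytes, odd: bytes) -> bytes:
    assert len(even) == len(odd), "even/odd halves must be the same size"
    out = bytearray()
    for e, o in zip(even, odd):
        out.append(e)  # even address -> low byte lane (D0-D7)
        out.append(o)  # odd address  -> high byte lane (D8-D15)
    return bytes(out)

# Usage (placeholder file names):
# with open("rom_even.bin", "rb") as fe, open("rom_odd.bin", "rb") as fo:
#     combined = interleave(fe.read(), fo.read())
# with open("rom_combined.bin", "wb") as out:
#     out.write(combined)
print(interleave(b"ACE", b"BDF"))  # b'ABCDEF'
```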