VOGONS


MCGA monitor at 15 KHz?


Reply 20 of 41, by mkarcher

VileR wrote on 2020-09-30, 13:02:

(Continuing the video-related discussion. Clearly off-topic by now; I'd ask a moderator to split it, but I know it's an annoying bunch of work). 😀

Inspired by our discussion, I photo-documented the tear-down of one of the PS/2 model 30 computers I had at hand at the time. The access window is closed now, but I will get back to the machines in a few months. I photographed every component from every interesting angle, and especially the IBM part numbers on them (though I wonder whether I forgot to shoot detail photos of the ISA riser stabilisation bracket). I also took photos of the mainboard and stitched them together. Even though there are visible stitching artifacts in one place, they can be used for tracing out a lot of stuff (of course, no info about the traces below the chips). I didn't have a good continuity beeper at hand, so there is no detailed trace-out of how the chips interoperate on the system yet.

I can tell you, though, that the R/G/B output pins on the VGA connector are directly connected to the IMSG171 RAMDAC and to 150-ohm pull-down resistors, so clearly no TTL-level signals on those pins. The R/G/B return pins are connected to the ground pin of the RAMDAC with separate traces; the ground pin of that chip is a star grounding point for the analog video signals. IBM spared no expense there. It is very unlikely that a digital RGBI signal is present on the VGA port: you have 6 pins for RGB plus returns, 2 pins for sync and one pin as digital ground, which already occupies 9 of the 14 usable pins (one pin is a key pin). I counted the traces to the VGA port, and there are 11, so there is no way to have digital RGB anywhere on that connector.

After reassembly, the system obviously POSTed (the floppy drive seek was audible, followed by a double beep, either for "keyboard missing" or "RTC battery flat"), but I had no time to clean and inspect the 8512 monitor. The modern Super-VGA monitor I had at hand didn't fit the keyed VGA socket, so no pictures yet. BTW: I quite like the ASCII-art pictures for "unknown time" and "insert floppy and press F1 to continue" shown by the PS/2 model 30 computers. A final side note (grinding back towards the thread topic): the power supply of the PS/2 model 30 I tore down (which is distinctly different from the power supply in the "executive workstation", starting with the form factor) uses hex screws without a security pin. The 2mm bit is too small, the 3mm bit is too big, so the screws are either 2.5mm or imperial (like 3/32"). I could unscrew them by abusing a T10 bit, though. I still don't have pictures of the inside of the power supply (although I would have liked to take some, as I remember seeing a label "Warning! Heat sink is live!" through the ventilation gratings), because in addition to three hex screws, the halves of the power supply are also joined with rivets. For the relief of the alarmed reader: there was no Dremel in reach.

Do you think the internet values a well-illustrated Model 30 teardown, or would I be wasting time making a nice illustrated teardown/reassembly guide for the PS/2 models? I haven't researched it, but hardware service manuals containing maintenance instructions and part numbers might be available, just as they are (were?) available for the classic ThinkPads, for example.

VileR wrote on 2020-09-30, 13:02:
mkarcher wrote on 2020-09-29, 08:46:

A consequence of the 16-bit design of the Hercules card is that the CRTC of the Hercules card runs at a character clock that is the dot clock divided by 16; essentially it produces 45 "characters", each 16 pixels wide, to get to 720 pixels per line. The CGA card in monochrome graphics mode, on the other hand, runs the CRTC at the dot clock divided by 8, so it is programmed to produce 80 "characters", each 8 pixels wide, to obtain 640 pixels per line. The 320-pixel mode runs the CRTC at the same timing; it just "merges" two 1-bit pixels into one 2-bit pixel in the output circuit (more in depth, it's basically the other way around: the 640 mode splits two-bit packets into two single pixels).

Appreciate the info - I never looked too deeply into the Hercules monographics stuff, but I did notice those 16-pixel wide "characters" when I had to find out something or other related to timing. So that explains why they went for that odd-looking scheme (and also, I think, why the H/V refresh rates for Hercules are subtly different from the MDA ones... I suppose they were lucky that 5151s and the like *did* have a bit of tolerance after all).

The Hercules CRTC setup in text mode is identical to the MDA CRTC setup, and thus the pixels per scanline and the scanlines per frame are identical in MDA text mode and Hercules text mode. But Hercules opted for the more widely available 16.000MHz quartz oscillator as clock source, whereas IBM for some reason (if anyone knows, please speak up!) went with a 16.257MHz oscillator.
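To put a rough number on that crystal difference (a quick sketch in C; the 98-character horizontal total of 9-dot characters is the usual MDA/HGC text-mode value and is assumed here, not taken from this thread):

#include <stdio.h>

int main(void)
{
    /* Assumed MDA/HGC text-mode horizontal total: 98 characters of 9 dots each. */
    const double dots_per_line = 98.0 * 9.0;

    printf("MDA hsync: %.2f kHz\n", 16.257e6 / dots_per_line / 1e3);  /* ~18.43 kHz */
    printf("HGC hsync: %.2f kHz\n", 16.000e6 / dots_per_line / 1e3);  /* ~18.14 kHz */
    return 0;
}

With identical CRTC programming, the whole timing simply scales with the oscillator, which matches the "subtly different" refresh rates mentioned above.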

VileR wrote on 2020-09-30, 13:02:

About CGA, that makes sense, since after all the data rate for the 320/640-pixel graphics modes is the same. The clock need only be doubled for 80-column text mode, so that two bytes (character + attributes) are transferred for each character period.
As you may be aware, with that doubled clock rate, other timings are not necessarily adjusted to compensate - the hsync pulse width for one is effectively halved, unless manually reprogrammed. For the composite output, this means it no longer fully overlaps with the NTSC color burst... which is effectively ANDed with the hsync pulse. This single fact caused no end of headache with the color output for 8088 MPH, and is the reason for the 'calibration' screen you get at the beginning.
I'm still wondering whether that was a bug in the CGA design, or a feature....

My understanding of the CGA hardware differs from my understanding of your text. As far as I know, the CRTC character clock (the CRTC knows nothing about horizontal pixels) runs at full rate in all modes except the 40-column text mode. The memory cycle time is the same in all modes, whereas the bandwidth requirement is low in 40-column text mode as well as in graphics mode, but high in 80-column text mode. The ISA bus can use every second memory cycle (let's call them the odd cycles). In the low-bandwidth modes, the CRTC just uses the even cycles, so the CRTC and the ISA bus don't interfere. In high-bandwidth mode, the CRTC needs every cycle during the display period, but the ISA bus has "priority" to the memory. An ISA cycle pending at an odd cycle preempts the CRTC address from the memory address lines, so the scan-out logic gets to see the data byte that was read or written by the processor instead of the data byte requested by the CRTC - and that's the snow you can get.

As the sync pulse width is derived from the CRTC character clock and not the pixel clock, the overly short sync pulses should occur in all modes except modes 0/1, of which only mode 1 needs a working color burst. I think IBM didn't care about a broken color burst in mode 3, because 80-column text on composite output is a bad idea anyway. Modes 5/6 are without color burst anyway, but for mode 4 a working NTSC color burst is very much intended, and it seems clearly like a bug if that mode has the broken short pulses.

VileR wrote on 2020-09-30, 13:02:

Now about the font loading: It differs from the EGA/VGA model in nearly every possible aspect. That's not too surprising in the end, as the EGA/VGA model is optimized for video display using the font, whereas the MCGA model is optimized for being transferred by the font loading DMA engine. Font loading seems to work like this:
[...]

Woah. Very well, I'm now fully convinced of the weirdness of MCGA. 😀

Since it's clearly a specific and optimized design on its own, rather than just a simple budget-version VGA (or an extended CGA), it's a bit puzzling that IBM expended all this engineering effort on a device intended for the budget low-end models... ones which were otherwise almost-obsolete on introduction!

I guess they wanted to push the XT clones out of the low-end market by providing an entry-level system with increased performance (8086 instead of 8088) and improved graphics (256 colors! 640x480!) as a competitor. This would never have taken off (indeed, did it really take off?) had it not been software compatible with an already established graphics solution (VGA in this case); on the other hand, they couldn't fit the whole VGA thing on the mainboard. As you might be aware, the "chipset" of the PS/2 model 30 does not consist of classic ASICs, but of mask-programmed gate arrays (two of them seem to make up the MCGA). The space in those chips is limited, so VGA might have needed more chips (imagine the complexity of the graphics controller!) - and space for 8 RAM chips on the board.

VileR wrote on 2020-09-30, 13:02:

The only two PS/2 model 30 ROM sets I can access right now are already on the internet, so no news on that.

In the interests of research, this is the one I was referring to -
ps2m30-30F9579_80.zip
There's also a similarly sized set for the Model 35 (even/odd, 64K each). For the 30, I do have sets for 4 different revisions, though only at the more common 32K-per-chip.... and I can't guarantee that they're all good dumps.

Actually, even though I have had the systems in the house for 25 years, it never occurred to me that they are 8086-based instead of 8088-based. The 8-bit expansion bus made me believe they were just highly integrated XTs. I only realized they use a 16-bit processor after wondering why on earth there would be an odd/even split of the BIOS in an 8-bit computer; I would have expected the two BIOS chips to be low 32K and high 32K instead. I will definitely take a look at the extended PS/2 ROM. Maybe they wanted the fonts for some WYSIWYG application (with serif and non-serif fonts being displayable at the same time).

Reply 21 of 41, by mkarcher

VileR wrote on 2020-09-30, 13:02:

In the interests of research, this is the one I was referring to -
ps2m30-30F9579_80.zip
There's also a similarly sized set for the Model 35 (even/odd, 64K each).

That set you posted there is clearly not for the classic 8530. It expects an AT-like mainboard with an AT keyboard controller and AT CMOS RAM. Most likely, it belongs to an 8530-Exx, the Model 30/286.

Reply 22 of 41, by VileR

mkarcher wrote on 2020-09-30, 18:36:

Do you think the internet values a well-illustrated Model 30 teardown, or would I be wasting time making a nice illustrated teardown/reassembly guide for the PS/2 models? I haven't researched it, but hardware service manuals containing maintenance instructions and part numbers might be available, just as they are (were?) available for the classic ThinkPads, for example.

I don't own a Model 30 and never claimed to be the sharpest tool in the shed when it comes to pure hardware, so I don't have a personal stake in answering that, but I'm pretty sure it would be valuable. Service manuals are often incomplete, not to mention unfriendly, and IBM's notoriously so.
The internet as a whole values viral TikTok videos, not research into oldschool hardware, but undoubtedly some individuals will benefit. 😀

My understanding of the CGA hardware differs from my understanding of your text. As far as I know, the CRTC character clock (the CRTC knows nothing about horizontal pixels) runs at full rate in all modes except the 40-column text mode. The memory cycle time is the same in all modes, whereas the bandwidth requirement is low in 40-column text mode as well as in graphics mode, but high in 80-column text mode. The ISA bus can use every second memory cycle (let's call them the odd cycles). In the low-bandwidth modes, the CRTC just uses the even cycles, so the CRTC and the ISA bus don't interfere. In high-bandwidth mode, the CRTC needs every cycle during the display period, but the ISA bus has "priority" to the memory. An ISA cycle pending at an odd cycle preempts the CRTC address from the memory address lines, so the scan-out logic gets to see the data byte that was read or written by the processor instead of the data byte requested by the CRTC - and that's the snow you can get.

As the sync pulse width is derived from the CRTC character clock and not the pixel clock, the overly short sync pulses should occur in all modes except modes 0/1, of which only mode 1 needs a working color burst. I think IBM didn't care about a broken color burst in mode 3, because 80-column text on composite output is a bad idea anyway. Modes 5/6 are without color burst anyway, but for mode 4 a working NTSC color burst is very much intended, and it seems clearly like a bug if that mode has the broken short pulses.

Well, there are two facts I'm positive about:

1) Horizontal timing values as sent to the CRTC registers - in character units - are doubled only in 80-col modes 2/3, compared to the other modes (see IBM's "Options and Adapters" for CGA, p17). Going by your proposal, all modes would have to use the doubled values, except for modes 0/1 (40-col text).

2) The halved sync pulse only occurs in 80-column text mode. Graphics modes certainly don't suffer from it on my two different IBM CGA cards (or on others that I've heard of), neither in mode 4 which provides NTSC burst by default, nor in mode 6 where you can enable it for artifact-color purposes.
With the "bug or feature" question I referred only to 80-column text: true, it's not very readable in color, but there are use-cases for it, e.g. in ANSI-type games. IBM itself did document the existence of the pseudo-160x100 "graphics" mode, although not the fact that it's 80-col text mode under the hood.

I guess they wanted to push the XT clones out of the low-end market by providing an entry-level system with increased performance (8086 instead of 8088) and improved graphics (256 colors! 640x480!) as a competitor. This would never have taken off (indeed, did it really take off?) had it not been software compatible with an already established graphics solution (VGA in this case); on the other hand, they couldn't fit the whole VGA thing on the mainboard. As you might be aware, the "chipset" of the PS/2 model 30 does not consist of classic ASICs, but of mask-programmed gate arrays (two of them seem to make up the MCGA). The space in those chips is limited, so VGA might have needed more chips (imagine the complexity of the graphics controller!) - and space for 8 RAM chips on the board.

That could be it - I believe it did take off, and probably better than IBM intended, at least early on. IIRC according to magazines, half (or more!) of the PS/2 line's sales around 1988 or so were for the low-end ISA models. Might have been only the Model 30 at that point. Probably due to both pricing and compatibility with existing add-on cards.

mkarcher wrote on 2020-09-30, 21:34:

That set you posted there is clearly not for the classic 8530. It expects an AT-like mainboard with an AT keyboard controller and AT CMOS RAM. Most likely, it belongs to an 8530-Exx, the Model 30/286.

You're right. My bad - I should really re-organize those folders properly. 😀 Having looked around a bit more, I now think that the 128K ROM sets only existed in the 286 and 386 machines, where the E000 segment mostly provided the "ABIOS" (i.e. IBM's proprietary extension which could be used in both real and protected modes, for OS/2 and such). Of course, none of that is spelled out in available IBM documentation - I had to look up a patent for that.
The thing is, apparently some MCA adapters with their own *ABIOS* extension ROMs could also be mapped to portions of E000. So it's possible that those additional fonts actually came from such an extension ROM... although they appear in both the model 30-286 and model 35-? dumps that I've located. I guess more research is pending.
[EDIT]: they also appear in new 25-286/30-286/25SX dumps that were recently uploaded here, so pretty definitely not an extension-ROM thing.


Reply 23 of 41, by mkarcher

VileR wrote on 2020-10-01, 15:03:

Well, there are two facts I'm positive about:

1) Horizontal timing values as sent to the CRTC registers - in character units - are doubled only in 80-col modes 2/3, compared to the other modes (see IBM's "Options and Adapters" for CGA, p17). Going by your proposal, all modes would have to use the doubled values, except for modes 0/1 (40-col text).

2) The halved sync pulse only occurs in 80-column text mode. Graphics modes certainly don't suffer from it on my two different IBM CGA cards (or on others that I've heard of), neither in mode 4 which provides NTSC burst by default, nor in mode 6 where you can enable it for artifact-color purposes.
With the "bug or feature" question I referred only to 80-column text: true, it's not very readable in color, but there are use-cases for it, e.g. in ANSI-type games. IBM itself did document the existence of the pseudo-160x100 "graphics" mode, although not the fact that it's 80-col text mode under the hood.

I need to do a 180 on this: I used to think that the 16-pixel character timing in graphics mode was HGC-only. A quick look into my book "Die Programmierung der EGA/VGA-Karte" ("Programming the EGA/VGA card"), which provides a chapter about CGA/HGC in an appendix, including the BIOS defaults, confirms it, as do the ROM dumps: your facts are completely true. And you are correct that 160x100 graphics with 16 colors would be a valid use case of mode 3 on a composite monitor. It seems you can't get correct timing, though: in standard mode, the CRTC outputs a 10-character-wide sync pulse. In 80-column text mode, the CRTC would need to output a 20-character sync pulse, but the maximum width you can set is 15 or 16 (I'm too lazy to look into 6845 datasheets to find out whether "0" is interpreted as 16 or not).

VileR wrote on 2020-10-01, 15:03:
mkarcher wrote on 2020-09-30, 21:34:

That set you posted there is clearly not for the classic 8530. It expects an AT-like mainboard with an AT keyboard controller and AT CMOS RAM. Most likely, it belongs to a 8530-Exx, the Model 30/286.

You're right. My bad - I should really re-organize those folders properly. 😀 Having looked around a bit more, I now think that the 128K ROM sets only existed in the 286 and 386 machines, where the E000 segment mostly provided the "ABIOS" (i.e. IBM's proprietary extension which could be used in both real and protected modes, for OS/2 and such). Of course, none of that is spelled out in available IBM documentation - I had to look up a patent for that.
The thing is, apparently some MCA adapters with their own *ABIOS* extension ROMs could also be mapped to portions of E000.

The model 30/286 is not an MCA machine, and the BIOS doesn't look like the E segment may be swapped out, nor like the code in it is protected-mode aware. Probably that ABIOS stuff is only in MCA-based machines. In your dump, the E segment contains (warning: this list might be incomplete): handlers for INT 11h (equipment flags), INT 12h (memory size), INT 05h (print screen), INT 13h (floppy) including IRQ6, INT 13h (hard drive: bus-connected ESDI?) including IRQ14, INT 16h (keyboard) including IRQ1, INT 10h (VGA, not MCGA) including the fonts and parameter tables, and some parts of INT 15h including IRQ12 (PS/2 mouse). The system would break down if that stuff were suddenly mapped out and replaced by option ROM areas.

Reply 24 of 41, by VileR

mkarcher wrote on 2020-10-01, 20:18:

And you are correct that 160x100 graphics with 16 colors would be a valid use case of mode 3 on a composite monitor. It seems you can't get correct timing, though: in standard mode, the CRTC outputs a 10-character-wide sync pulse. In 80-column text mode, the CRTC would need to output a 20-character sync pulse, but the maximum width you can set is 15 or 16 (I'm too lazy to look into 6845 datasheets to find out whether "0" is interpreted as 16 or not).

We had tested the latter question for the 8088 MPH 'calibrator'; it turned out that some revisions of the 6845 treated 0 as 16, and others didn't. 😀 I think it was the later ones that didn't (and the compatible Hitachi HD6845 didn't either), but I may have that backwards. I later wrote a little program that applies the pulse-width "fix" as a TSR (or a booter), and for that one I used 15, as that's the maximum value guaranteed to be compatible across CRTC revisions.

Luckily, 15 seems to be enough for a satisfactory result. The CGA only uses smaller slices of the CRTC's total hsync period to generate (1) the actual hsync pulse sent to the display, and (2) the following NTSC burst signal. The default 20-character width leaves some room to spare before and after these two signals. A width of 15 characters may cut out a little bit of the burst period, but in practice most composite monitors/TVs seem to detect enough of it.
It probably depends on the display, but this still provides much better results than the older common practice of setting the overscan color to #6 (more info here).
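A minimal sketch of the core of that pulse-width "fix" (not VileR's actual TSR, just the one register write it boils down to; outp() is the port-output function of DOS C compilers such as Microsoft C or Open Watcom):

#include <conio.h>   /* outp() */

/* Widen the CGA CRTC horizontal sync pulse, e.g. for 80-column text on composite.
 * 15 is the largest value that behaves the same on all 6845 revisions
 * (0 means 16 only on some of them). */
void cga_widen_hsync(void)
{
    outp(0x3D4, 3);    /* CRTC index: register 3, horizontal sync width */
    outp(0x3D5, 15);
}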

mkarcher wrote on 2020-10-01, 20:18:

The model 30/286 is not an MCA machine, and the BIOS doesn't look like the E segment may be swapped out, nor like the code in it is protected-mode aware. Probably that ABIOS stuff is only in MCA-based machines. In your dump, the E segment contains (warning: this list might be incomplete): handlers for INT 11h (equipment flags), INT 12h (memory size), INT 05h (print screen), INT 13h (floppy) including IRQ6, INT 13h (hard drive: bus-connected ESDI?) including IRQ14, INT 16h (keyboard) including IRQ1, INT 10h (VGA, not MCGA) including the fonts and parameter tables, and some parts of INT 15h including IRQ12 (PS/2 mouse). The system would break down if that stuff were suddenly mapped out and replaced by option ROM areas.

Thanks for having a look. Yes, I'm aware that there's no MCA in that machine, but I wasn't sure which models actually had the ABIOS in ROM - conceivably, it'd be useful in any 286/386-based PS/2, i.e. any of the models with protected-mode capability. Then again it's not really a necessity... apparently the only benefit is that it obviates the need for OS/2 to store equivalent routines in RAM.

As for the option ROM thing, that was based on some of the text in patent US5481709. If they were mapped to say E800, the only thing that would get overwritten is the bitmap data for those 5 extra fonts - which seem to be optional at best, if not completely unused. But that's all hypothetical.



Reply 25 of 41, by Stiletto

VileR wrote on 2020-09-30, 13:02:

(Continuing the video-related discussion. Clearly off-topic by now; I'd ask a moderator to split it, but I know it's an annoying bunch of work). 😀

Did the grunt work and split the monitor discussion to here. 😀

Moved to Marvin -> Video.

"I see a little silhouette-o of a man, Scaramouche, Scaramouche, will you
do the Fandango!" - Queen

Stiletto

Reply 26 of 41, by VileR


Thank ye.

For now I've been looking a little deeper into that Model 30-286 (VGA, not MCGA) video BIOS. Mainly to satisfy my own curiosity about whether those character bitmaps at E000:8F00 onwards are ever used at all. I can't find any code with an obvious reference to that data, at least nothing as obvious as the references to the well-documented 8x8/14/16-line charsets. So these mystery-meat bitmaps may just be filler left there by a programmer who was feeling artistic.... although that feeling was not necessarily justifiable. 😉


Reply 27 of 41, by digger


Kudos for all the meticulous detective work, people. Many of the details are downright dizzying. 😅

So if I understand this correctly, you could hook up an IBM PS/2 model with MCGA graphics to a TV through RGB SCART, with a simple passive adapter cable, provided that the TV supported 60Hz vertical NTSC frequency video? Such as for instance most European Sony TVs? And most 256-color 320x200 DOS games would be displayed flawlessly on them, without having to do any register level voodoo to tweak the graphics card into outputting the right frequencies? Or did I misunderstand?

Reply 28 of 41, by VileR

digger wrote on 2020-10-05, 20:11:

So if I understand this correctly, you could hook up an IBM PS/2 model with MCGA graphics to a TV through RGB SCART, with a simple passive adapter cable, provided that the TV supported 60Hz vertical NTSC frequency video? Such as for instance most European Sony TVs? And most 256-color 320x200 DOS games would be displayed flawlessly on them, without having to do any register level voodoo to tweak the graphics card into outputting the right frequencies? Or did I misunderstand?

The guy who confirmed it with his Sony PVM replied on VCF (link): "In my experimenting it depends. Mode 13h depending on the game either stays in 15.6KHz, but some also switch it to 31.46KHz (These rates are from my oscilloscopes frequency counter, so it might be accurate or off slightly, but being a Tektronix from the 80's that hasn't been serviced in a long while, who knows.)"

I guess some of those games may be bypassing or tricking the video BIOS. Other than that, you have the other known issues with MCGA: the better 256-color 320x200 games actually require VGA (for unchained mode, speed tricks etc.), and 16-color games for either VGA or EGA aren't going to work at all, or they'll fall back to 4 colors (with a custom palette if you're lucky).
So any grand expectations should probably be curbed, but this 'out-of-the-box' 15.2kHz capability is more or less unique in the IBM world, and MCGA has more tricks up its sleeve than the "crippled VGA" or "extended CGA" PR suggests. 😀


Reply 29 of 41, by digger

VileR wrote on 2020-10-05, 21:30:

I guess some of those games may be bypassing or tricking the video BIOS. Other than that, you have the other known issues with MCGA: the better 256-color 320x200 games actually require VGA (for unchained mode, speed tricks etc.), and 16-color games for either VGA or EGA aren't going to work at all, or they'll fall back to 4 colors (with a custom palette if you're lucky).

Well, there were a fair number of 16-color games that did use the MCGA 256-color mode to show the 16-color graphics. I remember trying some games on the 8086-based IBM PS/2 Model 30s we had in the computer lab in my high school. I distinctly remember trying Indiana Jones and the Last Crusade (which did show 16 colors on MCGA, using the 256-color mode), and Ghostbusters II, which, to my disappointment, only ran in 4-color CGA mode. Sierra On-line even supported the MCGA 256-color mode in later revisions of their first generation AGI interpreter.

Also, from what I've read on-line, a large percentage of the 256-color VGA games (if not most of them) ran fine on MCGA, since they didn't use those advanced VGA-only features you mentioned.

But yeah, the lack of backwards compatibility with at least the 16-color 320x200 EGA mode was indeed a bit of a bummer.

Still, back in the day I would definitely have preferred MCGA over the even more limited graphics capabilities of the Olivetti M24 we had at home. On that beast, all games were limited to 4-color CGA mode.

Reply 30 of 41, by mkarcher

VileR wrote on 2020-10-05, 21:30:

The guy who confirmed it with his Sony PVM replied on VCF (link): "In my experimenting it depends. Mode 13h depending on the game either stays in 15.6KHz, but some also switch it to 31.46KHz (These rates are from my oscilloscopes frequency counter, so it might be accurate or off slightly, but being a Tektronix from the 80's that hasn't been serviced in a long while, who knows.)"

The clock select bit (according to the IBM PS/2 Model 25 technical reference manual, which is incomplete but helpful and can be found on the Internet) is at CRTC index 10h, bit 4. Only the state "bit set = 25.175MHz" is documented; "bit clear = 14.318MHz" is undocumented, as is all the other stuff required to interface 15kHz monitors. VGA games might try to use CRTC index 10h as "Vertical Retrace Start" and accidentally reprogram the completely incompatible MCGA CRTC (actually, IBM calls that part the "video memory controller gate array") to use the VGA dot clock.
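If the MCGA registers can indeed be read back (they reportedly can, see later in this thread), flipping just that bit might look like the sketch below. This is untested guesswork based only on the documented bit position - other registers would very likely need adjusting as well; inp()/outp() are the DOS C compiler port I/O functions:

#include <conio.h>   /* inp(), outp() */

#define CRTC_INDEX 0x3D4
#define CRTC_DATA  0x3D5

/* Clear bit 4 of MCGA CRTC register 10h: undocumented 14.318MHz dot clock. */
void mcga_select_cga_dot_clock(void)
{
    unsigned char v;

    outp(CRTC_INDEX, 0x10);
    v = (unsigned char)inp(CRTC_DATA);
    outp(CRTC_DATA, v & ~0x10);
}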

Reply 31 of 41, by Jo22


Completely unrelated perhaps, but both classic video monitors and VGA CRT monitors disturb the VLF bands with their 15/31KHz noise.. 😀

Edit: Maybe not completely unrelated.
With a sound card and some software you can run a simple SDR receiver.
With a sampling rate of 96 to 192 kHz you can dive quite a bit into the VLF bands (note the Nyquist theorem).
Time stations, SAQ, submarines, thunderstorms on Jupiter, etc.
Provided that your CRTs are turned off.
Otherwise, you'll see their noise in the spectrum. On the other hand, an SDR could also be used to check whether the VGA noise is at 15kHz/31kHz.. 😉

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 32 of 41, by Benedikt

VileR wrote on 2020-10-05, 21:30:
digger wrote on 2020-10-05, 20:11:

So if I understand this correctly, you could hook up an IBM PS/2 model with MCGA graphics to a TV through RGB SCART, with a simple passive adapter cable, provided that the TV supported 60Hz vertical NTSC frequency video? Such as for instance most European Sony TVs? And most 256-color 320x200 DOS games would be displayed flawlessly on them, without having to do any register level voodoo to tweak the graphics card into outputting the right frequencies? Or did I misunderstand?

The guy who confirmed it with his Sony PVM replied on VCF (link): "In my experimenting it depends. Mode 13h depending on the game either stays in 15.6KHz, but some also switch it to 31.46KHz (These rates are from my oscilloscopes frequency counter, so it might be accurate or off slightly, but being a Tektronix from the 80's that hasn't been serviced in a long while, who knows.)"

I guess some of those games may be bypassing or tricking the video BIOS. Other than that, you have the other known issues with MCGA: the better 256-color 320x200 games actually require VGA (for unchained mode, speed tricks etc.), and 16-color games for either VGA or EGA aren't going to work at all, or they'll fall back to 4 colors (with a custom palette if you're lucky).
So any grand expectations should probably be curbed, but this 'out-of-the-box' 15.2kHz capability is more or less unique in the IBM world, and MCGA has more tricks up its sleeve than the "crippled VGA" or "extended CGA" PR suggests. 😀

As long as the registers can be read back, a timer-triggered TSR should be able to bend everything back to 15kHz. This would also be useful for VGA cards in combination with a TV set.
Does anyone know whether an MCGA can be talked into displaying a 640x200 16-color mode?
Needless to say, it would not be EGA compatible, but it would still be interesting for everything that uses external graphics drivers.

Reply 33 of 41, by mkarcher

Benedikt wrote on 2020-10-16, 19:33:

As long as the registers can be read back, a timer-triggered TSR should be able to bend everything back to 15kHz. This would also be useful for VGA cards in combination with a TV set.

I'll be able to test whether the registers can be read back in a few months, in a hands-on experiment on a PS/2 model 30. I'm afraid they might very well not be readable: IBM squeezed the complete MCGA into two mask-programmed off-the-shelf gate array chips, and I bet they were happy about anything they didn't need to implement.

Benedikt wrote on 2020-10-16, 19:33:

Does anyone know whether an MCGA can be talked into displaying a 640x200 16-color mode?
Needless to say, it would not be EGA compatible, but it would still be interesting for everything that uses external graphics drivers.

I'm afraid it is most likely not possible. The IBM documentation alludes to most of the capabilities (they even mention a "dot clock select" bit explicitly; it's just that they only document 1 = 25.175 MHz for VGA-compatible timing and omit 0 = 14.318 MHz for CGA-compatible timings), but at no point does it say anything about a possible graphics mode with 16 colors.

Reply 34 of 41, by reenigne


It'd definitely be an interesting exercise to see what registers are programmed to what values in various modes by the MCGA BIOS and to see what happens if you mix and match register values from different modes.

Reply 35 of 41, by Jo22

reenigne wrote on 2020-10-16, 22:07:

It'd definitely be an interesting exercise to see what registers are programmed to what values in various modes by the MCGA BIOS and to see what happens if you mix and match register values from different modes.

I think the same!
Would it be futile to run some of these older diagnostic programs on the MCGA?
NSSI, for example, has the ability to detect hidden video modes (on VGA)..

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 36 of 41, by mkarcher

Jo22 wrote on 2020-10-17, 00:42:
reenigne wrote on 2020-10-16, 22:07:

It'd definitely be an interesting exercise to see what registers are programmed to what values in various modes by the MCGA BIOS and to see what happens if you mix and match register values from different modes.

I think the same!
Would it be futile to run some of these older diagnostic programs on the MCGA?
NSSI, for example, has the ability to detect hidden video modes (on VGA)..

I reverse-engineered the IBM PS/2 model 30 BIOS. There are no hidden modes (i.e. calling INT 10h with a mode number other than 00-06, 11 or 13 does not initialize anything in the MCGA card).

Comparing the parameters in the BIOS for Mode #6 (640x200x2) and Mode #13 (320x200x256), the differences are limited to the following registers (a short sketch after the list combines them in code form):

  • CRTC register 9: Mode 6: Set to 01 (2 scanlines per character) for the 2-bank CGA addressing scheme; Mode 13: Set to 00 (1 scanline per character)
  • CRTC register 10h: Mode 6: Set to 18h, Mode 13: Set to 19h. According to IBM, the extra bit in Mode 13 means "When set to 1, this bit indicates that mode 13 is selected".
  • CGA mode register (3D8): Mode 6: Set to 10h, Mode 13: Set to 00. According to IBM, the extra bit in mode 6 means "When set to 1, this bit selects mode 6, 640-by-200 double-scanned graphics." Don't worry about "double-scanned": I'm quoting the PS/2 model 25 technical reference manual, and it very specifically mentions nothing about the 15kHz modes, although the main board and the usual BIOS versions do support them.
  • MCGA extended mode register (3DD): Mode 6: 00, Mode 13: 04. This bit is documented as "256 colors".
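As a sketch only (untested, and assuming all other registers keep their mode-6 values), switching those four registers from their mode-6 to their mode-13 settings, using the BIOS values listed above, would look like this:

#include <conio.h>   /* outp() */

void mcga_mode6_regs_to_mode13(void)
{
    outp(0x3D4, 0x09); outp(0x3D5, 0x00);   /* 1 scan line per character           */
    outp(0x3D4, 0x10); outp(0x3D5, 0x19);   /* mode control: "mode 13 is selected" */
    outp(0x3D8, 0x00);                      /* CGA mode register: clear mode-6 bit */
    outp(0x3DD, 0x04);                      /* extended mode register: 256 colors  */
}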

The IBM technical reference manual shows the complete register sets for 31.5kHz operation. Here are the corresponding parameters for 15.6kHz operation:

          Modes
Index    0,1   2,3   4,5    6    13    Name (from IBM tech ref)
 00       37    37    37    37    37
 01       27    27    27    27    27
 02       2F    2F    2F    2F    2F
 03       34    34    34    34    34
 04       FF    FF    FF    FF    FF   Vertical Total (in scan lines!)
 05       04    04    04    04    04
 06       C7    C7    C7    C7    C7   Vertical Characters(!) Displayed (in scan lines, again!)
 07       E0    E0    E0    E0    E0   Start Vertical Sync (in scan lines!)
 08       00    00    00    00    00   Reserved (original 6845: Interlace/Skew)
 09       07    07    01    01    00   Char. scan lines
 0A       06    06    00    00    00
 0B       07    07    00    00    00
 0C       00    00    00    00    00
 0D       00    00    00    00    00
 0E       00    00    00    00    00
 0F       00    00    00    00    00
 10       48    48    48    48    49   Mode Control
 11       30    30    30    30    30   Interrupt Control
 12       47    47    47    47    47   Char. Gen/Sync Pol
 13       00    00    00    00    00   Char. Font Pointer
 14       FF    FF    FF    FF    FF   Char. to Load


          Modes
Address  0,1   2,3   4,5    6    13    Name (from IBM tech ref)
 3C6      FF    FF    FF    FF    FF   PEL Mask
 3D8      28    29    0A    18    08   CGA Mode Control
 3D9      30    30    30    3F    30   CGA Border Control
 3DD      01    01    01    01    05   Extended Mode Control

Names are omitted where the register is the same in all modes and it seems to have the well-known 6845 name and function. Remarks in parentheses are mine.

Differences between the documented, well-known 31.5kHz operation and 15.6kHz operation (apart from the obvious changes in the normal CRTC counter registers):

  • The extended mode control register bit 0 is documented as "Reserved = 0", yet it is always set for 15.6kHz operation.
  • The char. gen/sync pol register has bit 0 set for 15.6kHz operation, whereas it is clear for 31.5kHz operation. Unsurprisingly, this is the hsync polarity bit, which selects positive hsync when set.
  • CRTC register 10h ("MCGA Mode Control") bit 6 is set for 15.6kHz. IBM explains it as "The inverse of this bit is used as the ninth bit of vertical compare circuits and must be set to 0." For 200-line operation the vertical compare values don't need to exceed 255, so the bit is set in the 15.6kHz modes.
  • CRTC register 10h bit 4 is clear in the 15.6kHz modes. IBM explains it as "This bit selects the dot clock and must be set to 1." Obviously, this bit selects the 14.318MHz dot clock when cleared.

These tables (together with the 31.5kHz table in the IBM technical reference manual) should give a nice starting point for experimentation (like setting the mode 11 and mode 13 bits at the same time, or combining the CGA mode 4/5 or mode 6 bit with one of the extended mode bits), but I would be surprised if there were hardware support for some 16-color graphics mode, as no documented mode needs that support. If they had built in 16-color graphics, they could have added support for the PCjr modes.
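As one possible starting point, here is a sketch that simply replays the 15.6kHz mode-13 column of the tables above (values taken verbatim from the BIOS parameters; untested, and the DAC palette and video memory setup are not handled here):

#include <conio.h>   /* outp() */

static const unsigned char crtc_15khz_mode13[0x15] = {
    0x37, 0x27, 0x2F, 0x34, 0xFF, 0x04, 0xC7, 0xE0,   /* indexes 00h-07h */
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,   /* indexes 08h-0Fh */
    0x49, 0x30, 0x47, 0x00, 0xFF                      /* indexes 10h-14h */
};

void mcga_set_15khz_mode13(void)
{
    unsigned int i;

    for (i = 0; i < sizeof crtc_15khz_mode13; i++) {
        outp(0x3D4, i);                       /* CRTC/extended register index */
        outp(0x3D5, crtc_15khz_mode13[i]);    /* value from the 15.6kHz table */
    }
    outp(0x3C6, 0xFF);   /* PEL mask              */
    outp(0x3D8, 0x08);   /* CGA mode control      */
    outp(0x3D9, 0x30);   /* CGA border control    */
    outp(0x3DD, 0x05);   /* extended mode control */
}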

Reply 37 of 41, by VileR


^ Excellent work! All the information you've presented in this thread could really use being compiled into a document somewhere.

Again, I marvel that they went through all the work of implementing extra firmware and hardware support for the lower scan rate, only to keep it completely undocumented. Perhaps a technical reference (if there is one) for that '7496 Executive Workstation' actually includes this information... or one of those other obscure IBM systems whose model numbers place them nearby.


Reply 38 of 41, by TheGreatCodeholio

VileR wrote on 2020-09-28, 16:09:

@mkarcher: gotcha - thanks for the details! The plot thickens...

mkarcher wrote on 2020-09-28, 12:55:

I checked the two PS/2 Model 30 machines that are still in reach (we used to have three for some time) for BIOS versions. One of them has the 61X8937/61X8938 pair (stickers), whereas the other one has the 61X8939/61X8940 pair (custom marked TC53257P toshiba mask ROMs). I'm going to investigate whether I can find BIOS code responsible for the 15kHz output. Actually, looking at the PS/2 model 30 mainboards for BIOS versions, they looked really similar (if not identical, I'm going to compare if you want) to the mainboards the 8-bit guy showed in his video.

That is interesting - there was a recent post on the Vintage Computer Forums where a dump of the 61X8937/38 pair was provided (European Rev.1, 12/13/1987). 61X8939/40 are apparently Rev.2 although the date is earlier (02/05/1987); can already be found at http://ibmmuseum.com/BIOS/8530/.
I was hoping to find some listings or commented disassemblies floating around somewhere, like the older IBM PCs, but couldn't come up with any. I guess that figures, since this was exactly when IBM did a 180° on the whole "open architecture" thing. Might as well have a shot with IDA anyway.

If 8-bit guy's video can indirectly lead us to confirm the existence of undocumented 15kHz support on MCGA, it might be good for something after all. 😁

DOSBox-X developer here: I have a PS/2 model 25 with MCGA and I can confirm that booting the system without a monitor attached triggers this 15KHz mode very reliably. 640x480 is non-functional, all other modes act appropriately. I have some MCGA study notes here for DOSBox-X's machine=mcga emulation mode.

MCGA register dump of either state (all registers are readable and writeable unlike write-only CGA registers):
http://hackipedia.org/browse.cgi/Comput ... 0snapshots

Issue tracker (with notes):
https://github.com/joncampbell123/dosbox-x/issues/777

The only part I have not yet been able to figure out is how MCGA fonts are set by the BIOS, or even whether the font is changeable beyond 8x8 and 8x16 fonts (could be in ROM). Which means the PS/2 graphics book I have from 1987-ish (the ONLY book that mentions MCGA!) might be completely wrong about the part of video RAM where you can supposedly write the font RAM.

DOSBox-X project: more emulation better accuracy.
DOSLIB and DOSLIB2: Learn how to tinker and hack hardware and software from DOS.

Reply 39 of 41, by mkarcher

TheGreatCodeholio wrote on 2021-02-11, 19:15:

DOSBox-X developer here: I have a PS/2 model 25 with MCGA and I can confirm that booting the system without a monitor attached triggers this 15KHz mode very reliably. 640x480 is non-functional, all other modes act appropriately. I have some MCGA study notes here for DOSBox-X's machine=mcga emulation mode.

MCGA register dump of either state (all registers are readable and writeable unlike write-only CGA registers):
http://hackipedia.org/browse.cgi/Comput ... 0snapshots

Issue tracker (with notes):
https://github.com/joncampbell123/dosbox-x/issues/777

The only part I have not yet been able to figure out is how MCGA fonts are set by the BIOS, or even whether the font is changeable beyond 8x8 and 8x16 fonts (could be in ROM). Which means the PS/2 graphics book I have from 1987-ish (the ONLY book that mentions MCGA!) might be completely wrong about the part of video RAM where you can supposedly write the font RAM.

I already wrote something about font uploading, which is partly guesswork from BIOS code, and not tested in all details on real hardware yet.

Fonts on the MCGA are handled just like on CGA/MDA: They are stored in a dedicated 8-bit wide memory chip with non-multiplexed addresses. The font memory is addressed using 4 bits of the row scan counter, and 8 bits of the character code. The MCGA has 13 address bits on the font memory, so one spare bit is used for 512 character mode or switching between two fonts. The important difference between MCGA and CGA/MDA is that the classic cards use a ROM chip, whereas MCGA uses a RAM chip. As the interface around the memory chip is identical for ROMs and reading SRAMs, the hardware design is very similar.

This has interesting consequences for font upload: the font memory is connected only to the video logic; it is not on the bus at all. The low four address bits are hardwired to the 4 row counter bits from the CRTC in the video memory controller gate array. The next eight address bits are hardwired to an 8-bit latch that latches character codes from the VRAM. Thus you do not have random access to the font memory, and the font uploading scheme has to deal with that. What they do: the video memory controller gate array has a special mode in which it doesn't fetch font data from the font RAM, but instead asserts a write to the font RAM while the attribute byte is on the internal font/attribute data bus. The rest of the timing stays as it is in text mode. This means: while scanning out scan line 0 (the top scan line) of a character row, only the top scan line of characters can be changed in font upload mode; while scanning out line 1, only the second scan line can be changed - and so on. Furthermore, as every character cycle fetches a new character byte that addresses the font RAM, every even-addressed byte fetched from video memory must contain the destination character code for one byte of font data, which is stored at the subsequent odd address. The read addresses for the VRAM are modified during font upload mode: A15 is always zero; A14/A13 are the source font block number; A12 to A9 are the current character scanline counter (i.e. 0 to 15 in 8x16 mode, 0 to 7 in 8x8 mode); A8 to A1 are a running index; and A0 chooses between character code and font pattern. (In fact, not every byte is addressed, as the video memory is scanned out using the VRAM shift register.)

In the suggested font upload mode, the font transfer happens during vblank - in three text rows (48 scanlines) of the blanking period. The CRTC counters just keep running (especially the scanline counter). Characters are only transferred during the horizontal display period, i.e. 80 characters per line. That's where the IBM figure "the MCGA hardware can transfer 240 characters during vertical blanking" comes from, as you get three rows of 80 characters. There is a register documented as "number of characters to upload during vblank" (IIRC CRTC register 14h), which curiously is always set to 0xFF - which is *more* than 240... This register might be something like a count of characters still to transfer; experimentation is required. When you start a font upload, all 256 characters are uploaded even when the count register is at 240.

One question remains: how do you put the font data into video memory? Easy: the 64KB of MCGA video memory is mapped to the segment range A000-AFFF (and the second half, from A800 to AFFF, is mirrored to B800..BFFF for CGA compatibility). As font uploading only uses addresses with A15=0 (as I wrote above), fonts are loaded from the area at A000..A7FF, which is not mirrored into the B area and thus does not conflict with the 32KB text buffer (which supports 8 pages of 80x25 text, just as EGA/VGA do). The four font blocks are at base addresses A000, A200, A400 and A600. The first 256 words of each of these blocks contain the first scan line of each character, the next 256 words the second scan line of each character, and so on. The BIOS always orders the characters from 00 to FF, so A000:0 is always 00, A000:2 is always 01, and the top scan line of character 01 is thus loaded from A000:3 - and so on.
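To make that layout concrete, here is a sketch (assuming an 8x16 font, a 16-bit DOS compiler with far pointers, and the MK_FP() far-pointer macro) that fills font block 0 at A000:0000 in exactly the order described above. Actually triggering the upload - the font-upload mode and the character count register - is left out, since those details are still partly guesswork:

#include <dos.h>   /* MK_FP() on Borland/Turbo C; Open Watcom has it in <i86.h> */

/* font: 256 characters * 16 pattern bytes, character 0 first */
void mcga_fill_font_block0(const unsigned char far *font)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0x0000);
    unsigned int scanline, chr;

    /* 16 groups of 256 words; each word = { destination character code, pattern byte } */
    for (scanline = 0; scanline < 16; scanline++) {
        for (chr = 0; chr < 256; chr++) {
            unsigned int ofs = (scanline * 256u + chr) * 2u;

            vram[ofs + 0] = (unsigned char)chr;          /* even address: character code */
            vram[ofs + 1] = font[chr * 16u + scanline];  /* odd address: that scan line  */
        }
    }
}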