VOGONS


No$Gmb display oddities


First post, by RetroMaster137

Rank: Newbie

Hello, this might be complicated. I'm mostly and merely interested in the technical background of all this... I'd like to understand it, though not necessarily do something about it (for now).

Sometime between 2007 and 2008, my family had two "horizontal-kind" Dell computers featuring Pentium III CPUs. I believe these were OptiPlex GX110 models (according to a photo search with Google, it's the only exact match in chassis shape and such). Perhaps I could assume their hardware was the same, or at least similar; they came with the same RAM amount, probably the same CPU clock speed and so on; only the HDDs were apparently different, one having slightly more storage than the other. A particular quirk I happen to remember was the capability of 24-bit color display on WinXP... The compatibility sucked for emulators and such, but I never saw 24-bit color ever again. I'm slightly worried only one of the two PCs was able to do it, though (implying a major hardware difference).
But I digress; probably the only major differences were in their setup:

  • PC A featured an incredibly old CRT screen from the nineties, which couldn't display resolutions above 800x600 and... I could be wrong, but it apparently was "fully analog"? Like, if the display resolution needed to change, it would do so immediately; it would NOT turn black for a while nor play noises nor anything, unlike every other CRT screen I have ever seen in my life (I was born in 1996).
  • PC B featured a more... "typical" CRT, I guess... It supported at least 1024x768, would take its time changing between resolutions, and I remember being surprised to see the screen's name and model within the display configuration menu in WinXP. That's how I learned about EDID.

Now on the topic of No$Gmb, it used to behave differently between the PCs.
As far as I understand, the emulator initially displays in 30 lines (320x240) instead of 25 lines (320x200), and shows a preview of what is currently being emulated at the top-right corner of the debugger.
That only worked properly on one PC, not both. On the incompatible PC, the 30-line mode displayed as 25, the bottom of the debugger was cut off, and there was no preview at the top-right corner. Bigger display modes did work (yet without the preview), but the lack of proper 30-line support seemed very whimsical to me at the time. I don't know whether this was a problem on PC A or B, though.
Full-screen emulation did work just fine on both computers, though, so I can't complain. But having the debugger work differently between them for seemingly no reason made no sense to me until today.
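(To make the technical question concrete: I assume a 320x240 mode like that isn't a standard BIOS mode, but the usual trick of setting mode 13h and then reprogramming the VGA CRTC for 480-line vertical timing, roughly like the classic "Mode X" tweak. I have no idea whether No$Gmb actually does it this way; the sketch below, in Turbo C style, only illustrates the kind of register poking involved.)

    /* Hypothetical sketch: turn BIOS mode 13h (320x200x256) into a 320x240
       mode by reprogramming the VGA CRTC for 480-line vertical timing.
       This is only my guess at the kind of tweak involved, not No$Gmb's
       actual code. Turbo C style port I/O from <dos.h>. */
    #include <dos.h>

    void set_320x240(void)
    {
        /* CRTC register index in the low byte, data in the high byte */
        static const unsigned crtc[] = {
            0x0D06,  /* vertical total */
            0x3E07,  /* overflow (high bits of the vertical counts) */
            0x4109,  /* maximum scanline: double-scan, so 480 lines show 240 rows */
            0xEA10,  /* vertical sync start */
            0xAC11,  /* vertical sync end (also re-protects CRTC 0-7) */
            0xDF12,  /* vertical displayed end */
            0xE715,  /* vertical blanking start */
            0x0616   /* vertical blanking end */
        };
        union REGS r;
        int i;

        r.x.ax = 0x0013;                  /* BIOS: set mode 13h */
        int86(0x10, &r, &r);

        outport(0x3D4, 0x2C11);           /* clear the protect bit on CRTC 0-7 */
        for (i = 0; i < sizeof(crtc) / sizeof(crtc[0]); i++)
            outport(0x3D4, crtc[i]);
        /* A full "Mode X" setup would also unchain video memory via the
           sequencer so all 320x240 pixels fit in the 64 KB window. */
    }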

My questions are:
Would it have been any different if I tried to run the emu under DOS rather than WinXP? Both were supposedly "Professional" editions, but only one would display the subtitle at boot time.
Would EDID play any role on this?
Is it possible that VGA capabilities/features were different across the computers?

Forgive my curiosity, and thanks for your time.

Reply 1 of 6, by RetroMaster137

Rank: Newbie

Apparently I can't edit my post but, ESL here, a grammatical clarification:
"I'm slightly worried only one of the two PCs was able to do it, though (implying a major hardware difference)"
was meant to be
"I don't remember, and I'm slightly worried, whether only one of the two PCs happened to be capable of it (which would imply a major hardware difference)"

Reply 2 of 6, by jakethompson1

Rank: Oldbie
RetroMaster137 wrote on 2024-08-02, 10:34:
  • PC A featured an incredibly old CRT screen from the nineties, which couldn't display resolutions above 800x600 and... I could be wrong, but it apparently was "fully analog"? Like, if the display resolution needed to change, it would do so immediately; it would NOT turn black for a while nor play noises nor anything, unlike every other CRT screen I have ever seen in my life (I was born in 1996).

I remember that too; it also relates to XF86Config. One thing about those old monitors is that they were quick enough at changing modes to show you the video BIOS copyright message, while multisync ones were too slow and it slipped by before you could see it.

The issue is the oldest VGA monitors could only handle some discrete list of frequencies such as 25.175 MHz and 28.322 MHz, while multisync monitors could sync anywhere within some range, and that syncing took some time...
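That difference is exactly what the Monitor section of XF86Config describes, by the way: a fixed-frequency VGA monitor gets a single HorizSync value, a multisync one gets a range, and the server rejects modelines whose timings fall outside it. Something along these lines (the exact figures are only illustrative):

    Section "Monitor"
        Identifier  "Fixed-frequency VGA"
        HorizSync   31.5            # only the standard VGA line rate
        VertRefresh 60-70
        # standard 640x480@60 timing, 25.175 MHz dot clock
        Modeline "640x480" 25.175 640 656 752 800 480 490 492 525
    EndSection

    Section "Monitor"
        Identifier  "Multisync"
        HorizSync   30-64           # can lock anywhere in this range (takes a moment)
        VertRefresh 50-100
    EndSection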

Reply 3 of 6, by mkarcher

Rank: l33t
jakethompson1 wrote on 2024-09-12, 01:49:
RetroMaster137 wrote on 2024-08-02, 10:34:

PC A featured an incredibly old CRT screen from the nineties, which couldn't display resolutions above 800x600 and... I could be wrong, but it apparently was "fully analog"? Like, if the display resolution needed to change, it would do so immediately; it would NOT turn black for a while nor play noises nor anything, unlike every other CRT screen I have ever seen in my life (I was born in 1996).

The issue is the oldest VGA monitors could only handle some discrete list of frequencies such as 25.175 MHz and 28.322 MHz, while multisync monitors could sync anywhere within some range, and that syncing took some time...

You are mixing up some things. While the VGA pixel clock numbers you quote are correct, CRT monitors do not care about the pixel clock. They do care about the sync frequencies, most prominently the horizontal sync frequency (which is 31.5 kHz for VGA, both in 720-pixel modes at 28.322 MHz and in 640-pixel modes at 25.175 MHz).

CGA monitors and TV sets, MDA/Hercules monitors and standard VGA monitors are "fixed frequency" monitors, limited to one single horizontal frequency. The horizontal frequency is a key specification of a monitor, because rapidly changing the magnetic field that deflects the beam from left to right involves moving a significant amount of energy around quite quickly, at voltages around 1 kV. This operation is implemented most easily when the whole circuit resonates at the desired frequency, so the energy moves mostly "by itself".

Attentive readers might have noticed that a monitor between CGA and VGA is missing from that list. This is not an oversight: the EGA monitor is a dual-frequency monitor. It can sync either to ~15 kHz or to ~21 kHz. As far as I know, it uses some transistors to switch the behaviour of the horizontal deflection circuit based on a control signal that tells the monitor which mode it should operate in.
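To put numbers on that (using the standard VGA totals of 800 and 900 pixel clocks per scanline, blanking included):

    25.175 MHz / 800 clocks per line ≈ 31.47 kHz
    28.322 MHz / 900 clocks per line ≈ 31.47 kHz

Both pixel clocks produce the same line rate, which is all the monitor ever sees.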

These monitors usually have a lot of potentiometers to tweak the oscillators for horizontal and vertical deflection. While dual-mode monitors might use a different set of potentiometers for some key settings (like image height), these monitors only have a limited amount of adjustability for different modes. They didn't need more: those old graphics cards only had very few different timings, and once the monitor was adjusted to the timings of some graphics card, it would work in all modes.

This scheme worked quite well as long as the set of modes remained limited. VGA had basically one timing (in three height variations), and 1024x768 at 87 Hz interlaced and 800x600 at 56 Hz were sufficiently similar in timing that a second set of adjustments was enough. When people started using a greater variety of modes, monitors appeared that exposed the full geometry controls (h-size, v-size, h-position, v-position) as user-accessible knobs, as these monitors could not come perfectly adjusted for every mode a user might want to use.

These monitors got annoying for users who switched modes often - and that's when monitors started to appear that blank the screen on mode changes. Those monitors have a microcontroller that measures the exact horizontal and vertical frequencies and uses these values to look up the desired geometry settings in a table of pre-programmed modes and a second table of recently-used non-standard modes. The entries of these tables can be changed by the user interacting with some buttons on the front panel. Whenever this kind of monitor loses sync (on a mode switch), the processor blanks the screen, then measures the new frequency (which takes some time) and calculates parameters for the geometry-generating circuits. When the geometry correction parameters for the new mode have been established, the monitor waits for the oscillators to settle at the new frequency, and disables blanking as soon as the image is stable. These newer monitors are called "digitally controlled" monitors, and were standard equipment since around ~1994 in Western Europe and North America.
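In rough C-like pseudocode, the control loop of such a monitor does something like this (purely illustrative, not any particular monitor's firmware; all the function names are made up):

    /* Illustrative sketch of a digitally-controlled monitor's mode handling. */
    for (;;) {
        if (sync_lost()) {
            blank_screen();                         /* hide the unstable picture */
            double hfreq = measure_hsync_khz();     /* measuring takes some time */
            double vfreq = measure_vsync_hz();
            struct geometry g;
            if (!lookup_factory_preset(hfreq, vfreq, &g) &&
                !lookup_user_mode(hfreq, vfreq, &g))
                g = default_geometry(hfreq, vfreq); /* best guess for an unknown mode */
            apply_geometry(&g);                     /* h/v size, position, ... */
            wait_until_oscillators_settle();
            unblank_screen();                       /* image is stable again */
        }
    }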

Some of those old monitors did make some noises (not play sounds) when modes were switched, for example when switching components in or out of the circuit using electromechanical relays. One MDA monitor I have usually makes a ticking noise that excites some bell-like tone (I guess the yoke is slightly banging against the tube) whenever it has to re-synchronize because the Hercules card was re-programmed from text to graphics mode or vice versa. The operation of the Hercules card is so different between these two modes that a mode switch always completely re-initializes the timing generation circuit. This needs to happen because in text mode it counts 80 characters (of 9 pixels each) plus blanking per line, and 25 rows (of 14 scanlines each) plus blanking per frame, while in graphics mode it counts 45 "characters" (of 16 pixels each) plus blanking, and 87 "rows" (of 4 scanlines each) plus blanking, so the "current position" of the beam from text mode can not be carried over to graphics mode, even though the active period of the picture is nearly identical in both modes.
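For completeness, the active areas work out to almost the same thing:

    text mode:     80 x 9  = 720 pixels wide, 25 x 14 = 350 scanlines
    graphics mode: 45 x 16 = 720 pixels wide, 87 x 4  = 348 scanlines

so the monitor sees nearly identical timing, even though the CRTC counts it out completely differently.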

Reply 4 of 6, by jtchip

Rank: Member
RetroMaster137 wrote on 2024-08-02, 10:34:

Sometime between 2007 and 2008, my family had two "horizontal-kind" Dell computers featuring Pentium III CPUs. I believe these were OptiPlex GX110 models (according to a photo search with Google, it's the only exact match in chassis shape and such). Perhaps I could assume their hardware was the same, or at least similar; they came with the same RAM amount, probably the same CPU clock speed and so on; only the HDDs were apparently different, one having slightly more storage than the other.

They may not both be the same system. From a quick search, the desktop version of the GX110 looks just like a GX1 (the text is incorrect; this is the regular desktop, not the SFF). Both can be equipped with Pentium IIIs, but the former is Socket 370 and uses the i810 IGP, while the latter is Slot 1 and has an on-board ATI Rage Pro (the text mistakenly says Rage II, when the picture clearly shows a Rage Pro). The different GPUs would explain the difference in behaviour you observed:

RetroMaster137 wrote on 2024-08-02, 10:34:

Now on the topic of No$Gmb, it used to behave differently between the PCs.
As far as I understand, the emulator initially displays in 30 lines (320x240) instead of 25 lines (320x200), and shows a preview of what is currently being emulated at the top-right corner of the debugger.
That only worked properly on one PC, not both. On the incompatible PC, the 30-line mode displayed as 25, the bottom of the debugger was cut off, and there was no preview at the top-right corner. Bigger display modes did work (yet without the preview), but the lack of proper 30-line support seemed very whimsical to me at the time. I don't know whether this was a problem on PC A or B, though.

I don't know why, though, other than to point out that at the time the i810 was only Intel's second GPU (and its first integrated one), its predecessor being the i740, whereas ATI had been making them since 1985.

Reply 5 of 6, by jakethompson1

Rank: Oldbie
mkarcher wrote on 2024-09-12, 20:00:

These newer monitors are called "digitally controlled" monitors, and were standard equipment since around ~1994 in Western Europe and North America.

That makes sense; the last new low-end 14" monitor I remember with physical knob controls was from around 1997 (bundled with the MB-8433UUD-based system, actually).

Reply 6 of 6, by jtchip

Rank: Member
jakethompson1 wrote on 2024-09-13, 23:39:
mkarcher wrote on 2024-09-12, 20:00:

These newer monitors are called "digitally controlled" monitors, and were standard equipment since around ~1994 in Western Europe and North America.

That makes sense; the last new low-end 14" monitor I remember with physical knob controls was from around 1997 (bundled with the MB-8433UUD-based system, actually).

FWIW, I bought a new Samsung SyncMaster 3Ne, with physical h/v size/position knobs, in late 1998, in Western Europe.