VOGONS


Why was 15kHz support dropped from the VGA standard?


First post, by schadenfreude1

User metadata
Rank Newbie

I've been searching the web to discover why IBM decided to drop support for 15kHz video signals and instead line-double them as a 31kHz signal, but I haven't found anything. Was this just a cost-savings measure? I can't imagine the savings would be significant for monitors that cost many hundreds of dollars back in the day, but please correct me if I'm wrong. And I am guessing new users of VGA monitors at the time didn't notice or didn't care that their low-res CGA and EGA games were now line-doubled on their new VGA displays — or that their new VGA games were displayed in a "fake" high resolution mode. If I'm not mistaken, the main competitors of DOS back then — think the Amiga, ST, and so on — all used a true low-res 200p or 240p display mode for their games, leaving the VGA DOS version of a multi-platform game as the only one rendered in fake high-resolution with less discernible scanlines. Maybe IBM saw the line-doubling as a "feature" and not a bug?

I'd love it if all VGA monitors could sync to 15kHz because then I could use one display for pretty much all my old consoles as well as DOS and Windows. And even back in the day, anyone with the right cables could've hooked their consoles to their VGA monitor for glorious RGB instead of the RF/CVBS nightmare we had to deal with here in the States.

Reply 2 of 63, by appleiiguy

User metadata
Rank Newbie

With a 15 kHz signal there is a timing limit. This limits vertical resolution. Assuming a standard vertical sync range of 50 to 60 Hz, a 15 kHz signal can only display approx 250-300 lines.
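As a rough check of that arithmetic (the ~8% vertical-blanking allowance below is an assumption; exact figures vary by standard):

#include <stdio.h>

/* Lines per frame at a ~15.7 kHz horizontal rate: total lines are simply
   the horizontal rate divided by the vertical refresh; some are lost to blanking. */
int main(void)
{
    double h_rate    = 15734.0;          /* NTSC-ish horizontal rate, Hz  */
    double v_rates[] = { 50.0, 60.0 };   /* common vertical refresh rates */
    double blanking  = 0.08;             /* assumed fraction lost to vertical blanking */

    for (int i = 0; i < 2; i++) {
        double total   = h_rate / v_rates[i];
        double visible = total * (1.0 - blanking);
        printf("%2.0f Hz vertical: %3.0f total lines, ~%3.0f visible\n",
               v_rates[i], total, visible);
    }
    return 0;
}

Which lands right in the 250-300 line range mentioned above.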

Reply 3 of 63, by Jo22

User metadata
Rank l33t++
schadenfreude1 wrote:

I've been searching the web to discover why IBM decided to drop support for 15kHz video signals and instead line-double them as a 31kHz signal, but I haven't found anything. Was this just a cost-savings measure? I can't imagine the savings would be significant for monitors that cost many hundreds of dollars back in the day, but please correct me if I'm wrong. And I am guessing new users of VGA monitors at the time didn't notice or didn't care that their low-res CGA and EGA games were now line-doubled on their new VGA displays — or that their new VGA games were displayed in a "fake" high resolution mode. If I'm not mistaken, the main competitors of DOS back then — think the Amiga, ST, and so on — all used a true low-res 200p or 240p display mode for their games, leaving the VGA DOS version of a multi-platform game as the only one rendered in fake high-resolution with less discernible scanlines. Maybe IBM saw the line-doubling as a "feature" and not a bug?

The Atari ST also had a hi-res mode, which usually required a separate monitor (35.7 kHz, see chart).
Perhaps multi-sync monitors weren't so common around the time when VGA was young (as they required complex circuitry).
In fact, I never used one of these fancy monitors until the late 90s/early 2000s.
I used the analogue models without any OSD (on-screen display) before.
Everything was configured via knobs.

schadenfreude1 wrote:

I'd love it if all VGA monitors could sync to 15kHz because then I could use one display for pretty much all my old consoles as well as DOS and Windows. And even back in the day, anyone with the right cables could've hooked their consoles to their VGA monitor for glorious RGB instead of the RF/CVBS nightmare we had to deal with here in the States.

That's what television sets with SCART were for. 😉

Half-Saint wrote:

There are VGA monitors that support 15 kHz but they're hard to find.

Monitors compatible to the Atari ST could be used.
Found a small chart.

http://www.atari-wiki.com/index.php/Flat_Panel_Displays

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 4 of 63, by h-a-l-9000

User metadata
Rank DOSBox Author

It does need additional circuits inside the monitor to support such a wide horizontal range. And if you were displaying the 200-300 lines on a high-resolution CRT, you would get an ugly picture with black lines in between (the lines torn apart vertically).

1+1=10

Reply 5 of 63, by schadenfreude1

User metadata
Rank Newbie
Half-Saint wrote:

There are VGA monitors that support 15 kHz but they're hard to find.

Yep, and they're awesome — but these are exceptions. The VGA standard doesn't say that monitors should accept low-resolution signals like this.

Jo22 wrote:

The Atari ST also had a hi-res mode, which usually required a separate monitor

But I'm willing to bet the majority of games for that system used the low-resolution mode, unlike the always line-doubled mode we had to use with the DOS versions of these games.

Jo22 wrote:

That's what television sets with SCART were for. 😉

Good luck finding one of these in the US!

appleiiguy wrote:

With a 15 kHz signal there is a timing limit. This limits vertical resolution. Assuming a standard vertical sync range of 50 to 60 Hz, a 15 kHz signal can only display approx 250-300 lines.

h-a-l-9000 wrote:

And if you were displaying the 200-300 lines on a high-resolution CRT you would get an ugly picture with black lines in between (the lines torn apart vertically).

And why is this a problem? Are you saying that 240p games displayed on an arcade monitor are ugly? If I boot up Monkey Island on an Amiga connected to a 1084, every horizontal line the video card outputs will be mapped to a single scanline on the monitor, and there will be discernible black lines in between them. But if I boot up Monkey Island on a DOS machine connected to a VGA monitor, every horizontal line the video card outputs will be mapped to two scanlines on the monitor, and the black lines between scanlines will be thinner. After years of playing 240p consoles and arcade games on arcade monitors and Sony PVMs and the like, I much prefer the true low-res display to the "fake high-res" display that VGA monitors give to low-res DOS games. Hence why I am wondering what the VGA creators were thinking when they removed true low-res support.

Reply 6 of 63, by NJRoadfan

User metadata
Rank Oldbie

VGA simplified monitors and reduced costs, period. IBM managed to make a display controller that was (mostly) backwards compatible with CGA and EGA software and supported the new VGA modes with a fixed-frequency monitor. EGA was already a dual-sync standard (15.75 kHz and 21.8 kHz) and the monitor was somewhat complicated to support both modes. VGA would have broken backwards compatibility with old monitors since it used analog RGB, so it was a good time to drop the older sync standards as well.

Reply 7 of 63, by Jepael

User metadata
Rank Oldbie

I think it's most likely for cost reasons.

Standard VGA resolutions used only a fixed 31.5 kHz horizontal rate, even though the vertical rate can be either 60 or 70 Hz. Normally, even if games did use weird video modes, they kept the horizontal rate constant, or the image would roll on standard IBM monitors.

EGA monitors had to handle both 15.7 kHz for CGA modes and 21.85 kHz for EGA modes (350 lines), so they had to support two ranges.
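To put numbers on that: the fixed-frequency trick is to keep the 31.469 kHz line rate and vary only the total line count per frame (the commonly documented VGA totals are 449 lines for the 400-line/70 Hz modes and 525 for the 480-line/60 Hz modes). A minimal sketch:

#include <stdio.h>

/* Fixed-frequency VGA: one 31.469 kHz horizontal rate gives both the
   70 Hz 400-line modes and the 60 Hz 480-line modes, purely by changing
   the total number of scanlines per frame. */
int main(void)
{
    double h_rate   = 31469.0;        /* VGA horizontal rate, Hz */
    int    totals[] = { 449, 525 };   /* total scanlines: 400-line vs 480-line modes */

    for (int i = 0; i < 2; i++)
        printf("%d total lines -> %.2f Hz vertical\n",
               totals[i], h_rate / totals[i]);
    return 0;
}

That works out to roughly 70.09 Hz and 59.94 Hz from the same horizontal rate.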

schadenfreude1 wrote:

And I am guessing new users of VGA monitors at the time didn't notice or didn't care that their low-res CGA and EGA games were now line-doubled on their new VGA displays

Well, I don't think game backwards compatibility was the first thing on IBM's mind. The first users of VGA were businesses running productivity applications, not consumers and gamers.

Edit: oh, I was too slow typing this; I said many of the same things.

Reply 8 of 63, by h-a-l-9000

User metadata
Rank DOSBox Author

> And why is this a problem?
Imagine you were IBM and had to sell this. It's pure eye cancer. High-resolution monitors require a smaller beam diameter, which will cause extra-large black lines in a low-resolution mode.

1+1=10

Reply 9 of 63, by Jo22

User metadata
Rank l33t++
schadenfreude1 wrote:
Jo22 wrote:

That's what television sets with SCART were for. 😉

Good luck finding one of these in the US!

Sorry, I forgot about this. Is this really still an issue today?
I thought cheap multi-norm TVs literally flooded the markets all around the world. 😳
In fact, I haven't seen a TV set without SCART since the 90s. I think this was about the time when European CRT TVs
got the ability to do 50Hz/60Hz on the fly. Older 80s models had issues with this and didn't support NTSC colour coding
used via composite (I believe French models usually didn't support PAL/NTSC at all, but used 50Hz composite sync and RGB).
But even some of these older video monitors did display 60Hz in monochrome when you used the V-Sync knob.
And colour also worked fine on these old monsters when they got RGB signals via SCART.

Speaking of old monsters, I think older Commodore Amiga monitors did also have RGB on their SCART sockets.
Perhaps you can get one of them in the US. If not, you could also try to get some SCART-VGA converters
or VGA TV tuner boxes. They are quite common here and some of them do have SCART support.
The only downside is that they do conversion stuff and you will lose that 15 kHz arcade look.
And you have to make sure it does support RGB, because S-Video and RGB can't co-exist on the SCART plug.
If the box does have a separate S-Video connector, chances are good that its SCART port is wired for RGB.
Oh, and I should also mention that Euro SCART and Japanese SCART have different pinouts.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 10 of 63, by BloodyCactus

User metadata
Rank Oldbie

Sony PVMs, some NEC MultiSyncs, arcade monitors... they all did 15 kHz. Amiga 1084s are in demand and high in price, good for arcade repair on the bench. The PVMs are awesome, and often had all kinds of inputs, PAL + NTSC, etc.

--/\-[ Stu : Bloody Cactus :: [ https://bloodycactus.com :: http://kråketær.com ]-/\--

Reply 11 of 63, by schadenfreude1

User metadata
Rank Newbie
NJRoadfan wrote:

VGA would have broken backwards compatibility with old monitors since it used analog RGB

I like your answer, but I don't understand this part. Can you explain further? And do you know how VGA monitors display 640×350 EGA signals? Are they doubled every other line or something?

h-a-l-9000 wrote:

The high resolution monitors require smaller beam diameter which will cause extra-large black lines for a low resolution mode.

There are CRT monitors that can display 240p signals all the way up to 1080i. Just draw thicker scanlines for low-resolution images.

Jo22 wrote:

Sorry, I forgot about this. Is this really still an issue today ?

Yep, but as BloodyCactus said above, there are alternatives. The easiest to acquire is a late 90s/early 2000s SDTV with component (YPbPr) inputs, then convert all your RGB sources to YPbPr with a transcoder. As for me, I'm the proud owner of a PVM.

Reply 12 of 63, by NJRoadfan

User metadata
Rank Oldbie

CGA and EGA cards output what is known as TTL or "digital" RGB. The monitor had extra electronics to decode the signals into a limited number of colors. CGA supported 16 colors tops (RGB + intensity pin = 4 bits), and EGA supported 64 colors (RGB plus an intensity pin for each color = 6 bits). I'm sure it had its advantages somewhere (less electronics on the display adapter?), but it had to go in favor of larger color palettes.
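Roughly speaking, the monitor side just weights those TTL lines into analogue levels. Here is a small sketch of the idea; the bit ordering and the 1/3-2/3 weighting are assumptions for illustration, not a schematic of any particular monitor:

#include <stdio.h>

/* Illustrative TTL-to-analogue mapping.
   CGA: 4 bits (I + R,G,B)                         -> 16 colours.
   EGA: 6 bits (primary + secondary line per gun)  -> 64 colours.
   Bit layout and weighting here are assumptions for illustration. */

/* bits 5..0 assumed as R G B r g b (primary, then secondary per gun) */
static void ega_ttl_to_rgb(unsigned bits,
                           unsigned char *r, unsigned char *g, unsigned char *b)
{
    *r = 0xAA * ((bits >> 5) & 1) + 0x55 * ((bits >> 2) & 1);
    *g = 0xAA * ((bits >> 4) & 1) + 0x55 * ((bits >> 1) & 1);
    *b = 0xAA * ((bits >> 3) & 1) + 0x55 * ((bits >> 0) & 1);
}

int main(void)
{
    unsigned char r, g, b;
    ega_ttl_to_rgb(0x3F, &r, &g, &b);   /* all six lines high -> white */
    printf("0x3F -> R=%02X G=%02X B=%02X\n", r, g, b);
    printf("CGA: %d colours, EGA: %d colours\n", 1 << 4, 1 << 6);
    return 0;
}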

Reply 13 of 63, by 133MHz

User metadata
Rank Oldbie

I agree with the consensus that it was done to simplify the deflection subsystem design for cost reduction. Multisync monitors are considerably more complex (and thus costly) than their fixed-frequency counterparts - things like a better flyback transformer, a variable B+ voltage, and dynamic width/height adjustments are needed, along with some logic to determine which scan rate is in use and electronically switch all those circuits appropriately. The EGA approach with the sync polarities, while simple, didn't leave much room for growth. It made sense to conceive VGA as a fixed-frequency standard by double-scanning lower resolutions for backwards compatibility, keeping the then-costly high-resolution monitor as simple as possible. But then the PC performance race took off with ever-increasing video resolutions and color depths, so monitor technology had to keep pace. With better and cheaper technology available, fitting a whole microcomputer inside the monitor to determine the resolution by measuring sync rates on the fly, and to control a bunch of MOSFETs that drive the deflection circuits appropriately while keeping constant picture width/height across all video modes, wasn't such a crazy idea anymore; it came to be expected from even the cheapest of monitors. But by then the lower end of the standard had already been set to 31 kHz horizontal by the original VGA standard, so hardly any display manufacturer made displays that could scan lower than that. Why bother when video cards aren't outputting anything lower than 31k?
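As a toy illustration of that "microcomputer inside the monitor" idea: conceptually the controller measures the incoming sync rate and switches the deflection setup to match. Something like this in spirit (the frequency bands are rough, made-up ranges, not taken from any real monitor's firmware):

#include <stdio.h>

/* Toy sketch of what a multisync monitor's controller does: classify the
   incoming signal by its measured horizontal sync rate and pick the
   matching deflection setup. Bands are rough, illustrative ranges only. */
static const char *classify(double khz)
{
    if (khz < 18.0) return "15 kHz class (CGA / TV / arcade)";
    if (khz < 25.0) return "EGA 350-line class (~21.8 kHz)";
    if (khz < 35.0) return "VGA class (~31.5 kHz)";
    return "SVGA or higher";
}

int main(void)
{
    double samples[] = { 15.7, 21.8, 31.5, 37.9 };
    for (int i = 0; i < 4; i++)
        printf("%.1f kHz -> %s\n", samples[i], classify(samples[i]));
    return 0;
}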

Another thing to consider is that while we love our thick black spaces between video lines now, back then they were seen by the general public as a limitation of raster-scan display technology; being able to see the line structure of images was perceived as a bad thing, reminiscent of interlaced analog television. I believe it is sensible to think that IBM saw its decision to double-scan on VGA as an 'improvement' to low resolution imagery. Little did they know we'd be romanticizing low resolution video and its limitations decades later. 😵

With that said, if there are parallel universes, and in one of them IBM developed a VGA standard that kept backwards compatibility with MDA/CGA/EGA modes at their original sync timings, requiring a more complex tri-sync monitor at the start (like some Japanese computer manufacturers actually did for their domestic market), but thereby setting the lower end of the horizontal scan range to 15 kHz for every multisync PC display ever made after that... I'd totally give everything to live in that universe. 🤣

Jo22 wrote:

Sorry, I forgot about this. Is this really still an issue today ?
I thought cheap multi-norm TVs literally flooded the markets all around the world. 😳

Unfortunately it is. Cheap multi-standard TVs came to the American market way too late, very close to the decline of CRT as a television display technology. SCART is non-existent in America and even in some PAL territories (like the developing parts of Asia); we had to make do with composite, with S-Video being a high-end luxury and component video appearing way too late to make any real difference. On top of that, most TVs in NTSC-land without V-hold knobs won't sync to 50 Hz video, nor will they display color in anything other than NTSC. Even high-end brands/models made well into the early-to-mid 2000s won't sync to anything other than 60 Hz NTSC - I have a high-end 34" Panasonic CRT and have used similar Sony WEGAs and XBRs for the American market, and they always display a rolling B&W picture when fed PAL/50Hz video. It's like they're made that way on purpose, for market segmentation or who knows what. Ironically, the crappy cheap Chinese-made CRT TVs that flooded supermarkets a decade ago were much better on the multi-standard front, most of them having no problem with PAL-50 video at all while the 'real' 'quality' brands kept on rolling in B&W; but cheap or expensive, you were still stuck with composite or S-Video at most. No RGB input on consumer sets, ever, and component started appearing too close to the demise of CRT, so it saw very little use during its day; most non-tech-savvy people made the jump from composite on CRT directly to HDMI on LCD/Plasma.

If you want RGB perfection and/or enjoy some 50Hz games/demos the way they were intended to here in NTSC-land, you're in for quite the challenge.

http://133FSB.wordpress.com

Reply 14 of 63, by reenigne

User metadata
Rank Oldbie

Just to add one more snippet of information to this: although IBM didn't make a 15kHz-capable monitor that plugged into the 15-pin VGA port or a VGA card with a digital (9-pin port) output, the actual VGA chipset is capable of generating 15kHz signals. There is even a register bit to tell the DRAM controller to do 5 refreshes per scanline instead of 3 in order to cope correctly with the lower rate. I don't know of any commercial software that does this, but I have successfully generated NTSC composite output using an IBM VGA card. It would also be pretty straightforward to generate 15kHz ("arcade style") analogue CGA from a VGA card.
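For the curious, the arithmetic works even with a standard VGA pixel clock. The horizontal total below is an assumed figure chosen to illustrate the idea, not the exact programming reenigne used:

#include <stdio.h>

/* Back-of-envelope check that a standard VGA pixel clock can reach a
   ~15.7 kHz line rate: horizontal rate = dot clock / dots per line.
   The horizontal total here is an assumption for illustration; real
   CRTC programming involves more registers than this. */
int main(void)
{
    double dot_clock      = 25175000.0;  /* standard 25.175 MHz VGA clock */
    int    chars_per_line = 200;         /* assumed horizontal total, in 8-dot characters */
    double h_rate         = dot_clock / (chars_per_line * 8);

    printf("%d chars/line -> %.1f Hz (~15.7 kHz)\n", chars_per_line, h_rate);
    printf("Scanlines per 60 Hz frame: %.1f\n", h_rate / 60.0);
    return 0;
}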

Reply 15 of 63, by SquallStrife

User metadata
Rank l33t
reenigne wrote:

There is even a register bit to tell the DRAM controller to do 5 refreshes per scanline instead of 3 in order to cope correctly with the lower rate. I don't know of any commercial software that does this, but I have successfully generated NTSC composite output using an IBM VGA card. It would also be pretty straightforward to generate 15kHz ("arcade style") analogue CGA from a VGA card.

I wonder if the Sega Teradrive does this? It has a composite output that can be used when in PC mode, and its video chip is definitely VGA.

VogonsDrivers.com | Link | News Thread

Reply 16 of 63, by reenigne

User metadata
Rank Oldbie
SquallStrife wrote:

I wonder if the Sega Teradrive does this? It has a composite output that can be used when in PC mode, and its video chip is definitely VGA.

I don't think so - http://nfggames.com/games/teradrive/ says that "The PC output cannot use the composite port." So I guess it's just for the Megadrive output.

Reply 17 of 63, by SquallStrife

User metadata
Rank l33t
reenigne wrote:
SquallStrife wrote:

I wonder if the Sega Teradrive does this? It has a composite output that can be used when in PC mode, and its video chip is definitely VGA.

I don't think so - http://nfggames.com/games/teradrive/ says that "The PC output cannot use the composite port." So I guess it's just for the Megadrive output.

That's wrong, I think.

https://www.youtube.com/watch?v=-Y5IfNGIp00

VogonsDrivers.com | Link | News Thread

Reply 18 of 63, by reenigne

User metadata
Rank Oldbie
SquallStrife wrote:

That's wrong, I think.

https://www.youtube.com/watch?v=-Y5IfNGIp00

I think that's got to be the composite output of a CGA card, not the Teradrive's native composite - 8088 Domination won't generate colour output on a VGA card (composite or otherwise).