schadenfreude1 wrote:but I am talking about the hypothetical situation where VGA monitors would support the NTSC broadcast standard as well.
Did you never wonder why such monitors don't exist?
If we look at monitors/TVs from the broadcasting world, we see that higher-resolution monitors tend to resample low-resolution signals in hardware to improve image quality.
Just look at some of the last CRT TVs that were available, in the HD-era, which could take HD-ready signals.
schadenfreude1 wrote:...that hypothetical might not have been cost-effective for 1987, though here's an NEC monitor from 1987 that supports 15kHz/24kHz/31kHz (for 128,100 yen): http://homepage1.nifty.com/y-osumi/parts/pc-tv453n/
Question is: how does the 15 kHz mode look?
I had an Eizo Flexscan 15" monitor back in the day which did 15 kHz, and as I said, it looked ugly.
schadenfreude1 wrote:And EGA monitors are multisync;
No, they aren't. You might want to get your facts straight before making such claims. It's damn annoying to have to correct basic stuff like this, which invalidates your entire argument to begin with.
EGA monitors aren't multisync, they're dual-sync.
They only support two hardwired modes, and the mode is selected by the polarity of the hsync pulse.
Multisync is completely different, and can basically support 'any' timing within a certain range. It automatically detects the signal and locks on to it.
Mind you, the VGA standard itself is not multisync. A standard VGA monitor only has to support the standard VGA modes, much like how EGA works.
Many VGA monitors are actually SVGA monitors, and use multisync for compatibility with a wide range of hardware.
schadenfreude1 wrote:I wouldn't call it a "design flaw" — more like a clever trick to our benefit.
A 'trick' implies that something was actively designed to make it possible.
But since they just use standard progressive-scan NTSC-compatible signals, they're not doing anything special. They're just using the CRT the way it was originally designed for NTSC, except that NTSC alternates even and odd fields, and the hardware only ever outputs even fields.
schadenfreude1 wrote: The alternative would be to view interlaced video on a screen a foot away from your face, which would give me a headache.
Doesn't sound like you know much about how the hardware works.
You should try coding on Amiga sometime. It is one of the few computer systems that natively supports interlaced screenmodes. The headache you will be getting is that you have to design your game to take care of the differences in even and odd frame timings, scanline placement and that sort of thing.
The Amiga has a very powerful Copper coprocessor to help you with that (it supports two copperlists, so you can use separate even and odd copperlists). But that's a high-end 16-bit machine from 1985. With the technology of the late 70s or early 80s, it was far too complicated to build a circuit that supported interlaced graphics for games. Heck, early arcade machines barely even supported a framebuffer. It was all racing the beam.
schadenfreude1 wrote:By comparison, displaying interlaced video on televisions was more tolerable because we sat back a lot farther from them. Unfortunately, using the non-interlaced mode halves the resolution, but at least we get to keep our eyesight.
You are completely ignoring the fact that they used standard NTSC CRTs because they were commodity hardware.
In the professional world (e.g. CAD), monitors with much higher resolutions (and no interlacing) were available. They were just stupidly expensive, so you couldn't put one in an arcade machine and still hope to turn a profit.
It's kinda like arguing that the C64 only had 64K of memory while the Apple Lisa came with 1 MB. Sure, the technology was there to put 1 MB of memory in a machine. Problem is, the Lisa had a price tag of $10,000, and the C64 was meant to be affordable.
Like home computers of the era, arcade machines didn't represent the state-of-the-art of technology, and a lot of choices were made simply to cut costs and make them affordable enough for a wide audience.
You're just trying to romanticize arcade machines in a way that is far from realistic.