To my understanding, the flyback transformer in a video monitor must be kept in resonance; otherwise it overheats and fails (its winding uses wire as thin as a hair, since it is voltage-operated).
So in order to use a video signal with true 200 or 240 lines (progressive), the timings must differ from the NTSC standard. Approx. 7 kHz horizontal frequency?
Edit: My bad, 7 kHz wasn't correct. 60 Hz times 200 lines would be 12 kHz (arithmetically).
That is, if the signal were natively supported, without any borders, like in a custom-built terminal device.
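As a quick sanity check of that arithmetic, here's a minimal Python sketch; the numbers assume a hypothetical borderless signal with zero vertical blanking lines, which no real standard actually allows:

```python
# Back-of-the-envelope check, assuming a hypothetical borderless
# progressive signal with no vertical blanking lines at all.
field_rate_hz = 60      # vertical refresh rate
visible_lines = 200     # active lines, zero blanking assumed

h_freq = field_rate_hz * visible_lines
print(f"naive horizontal frequency: {h_freq} Hz")  # 12000 Hz = 12 kHz

# For comparison, standard NTSC scans 262.5 lines per field at ~59.94 Hz:
print(f"NTSC line rate: {59.94 * 262.5:.1f} Hz")   # ~15734.3 Hz
```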
Edit: I remembered it. What I was thinking of was the relationship between the sawtooth generator
(triggered by the blacker-than-black sync pulses, controlling the deflection frequency, with the sync retrieved from the VBS/composite signal) and the way the CRT rasterizes the image.
To my understanding, the timings of that signal were designed with interlaced video in mind (480i/576i).
So depending on the point of view, a traditional analog TV or video monitor does kind of "expect" an interlaced signal at some point.
If fed a "pseudo" progressive-scan signal (240p/288p), the raster has "dead" lines (rows that are never scanned).
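To make that concrete, here's a toy Python sketch of which raster rows get lit in each case; the 480-row count and the even/odd numbering are illustrative assumptions, not a model of any particular chassis:

```python
# Toy model (not a simulation of real deflection hardware): which of
# the 480 visible NTSC raster rows get lit under interlaced vs.
# "pseudo progressive" (240p-style) timing.
VISIBLE_ROWS = 480

# Interlace: field 1 scans the even rows, field 2 (shifted by the
# extra half line) scans the odd rows in between, covering every row.
interlaced = set(range(0, VISIBLE_ROWS, 2)) | set(range(1, VISIBLE_ROWS, 2))

# 240p trick: the half-line offset is dropped, so every field retraces
# the same rows and the rows in between are never scanned.
pseudo_progressive = set(range(0, VISIBLE_ROWS, 2))

print(len(interlaced))                          # 480 rows lit
print(len(pseudo_progressive))                  # 240 rows lit
print(VISIBLE_ROWS - len(pseudo_progressive))   # 240 "dead" rows
```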
In order to display such a signal "natively" (real progressive scan, no fields, no missing lines),
the sawtooth generator's signal would have to be adjusted accordingly, resulting in a squashed picture.
To avoid that, the size of the CRT's picture element (the beam spot) would also have to be enlarged by the same factor, as compensation.
The focus might also need a slight adjustment, and some voltages would have to be trimmed, the deflection magnets as well (for linearity).
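As a rough estimate of the squash: if one assumes the vertical sawtooth keeps its ramp slope while its period is shortened from 262.5 to exactly 240 lines per field (a simplifying assumption for illustration, not measured behavior of any set), the deflection amplitude shrinks in proportion:

```python
# Rough estimate of the squash, under the simplifying assumption that
# the vertical sawtooth keeps its slope while its period is shortened
# from the standard 262.5 lines per field to exactly 240 lines.
standard_lines = 262.5
adjusted_lines = 240

squash = adjusted_lines / standard_lines
print(f"raster height shrinks to about {squash:.0%} of normal")  # ~91%
```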
Essentially, it's like tuning an oscilloscope accordingly. Ironically, an oscilloscope was itself the primary instrument used to adjust a TV.
Edit: Rediscovered that story about FCC vs 240p, too.
https://www.digitalfaq.com/forum/video-captur … .html#post17658
"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel
//My video channel//