leileilol wrote:Gemini000 wrote:Whether it's an aperture grille CRT or not makes no difference. Most modern CRT monitors do not physically support low resolution modes, so to compensate, they run a higher resolution, typically 720x400 or 640x480, and just double the number of pixels horizontally or vertically as necessary. This tends to create dark horizontal lines through the displayed image. (AFAIK: vertical with an aperture grille, though I'm not 100% certain.)
Didn't this double-scan behavior start with VGA? This behavior isn't restricted to "modern" CRTs. You get these same lines in 640x400 too; in fact, it's the lines and the comfortable 70 Hz that made me prefer 400 over 480 where available, as well as alleviating some stretch artifacts in some games.
Unless a CRT actually digitally alters the signal before display (like most CRT HDTVs, but not most CRT VGA monitors, AFAIK), there won't be any scaling going on as such.
Additionally, analog monitors support an indefinite number of possible horizontal resolutions: any analog monitor can display any number of horizontal pixels; it's just a matter of the dot clock used by the video source (i.e. even an old SDTV could display 1280x480i/240p), though whether those pixels would be distinctly visible is another matter. (Beam width, precision, and phosphor dot pitch are all practical limiting factors on all monitors, and the main reasons why some monitors display "sharper" than others.)
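To illustrate the dot clock point: the monitor only sees a fixed amount of time per scanline; the source's pixel clock decides how many pixels get sliced into that line. A rough sketch (the 18% blanking fraction is an assumption for illustration, not a figure from any standard):

```python
# Sketch: the horizontal sync rate fixes the time available per scanline;
# the dot clock decides how many pixels the video source packs into it.

def dot_clock_hz(active_pixels, blanking_fraction, hsync_hz):
    """Pixel clock needed to fit `active_pixels` into one scanline.

    blanking_fraction: portion of the line spent in horizontal blanking
    (retrace + porches) -- an assumed round figure here.
    """
    total_pixels = active_pixels / (1.0 - blanking_fraction)
    return total_pixels * hsync_hz

# An SDTV-like 15.734 kHz line can carry 320, 640, or 1280 active pixels
# alike; only the source's dot clock changes, not the monitor's job:
for width in (320, 640, 1280):
    mhz = dot_clock_hz(width, 0.18, 15_734) / 1e6
    print(f"{width} px/line -> ~{mhz:.1f} MHz dot clock")
```

Doubling the horizontal resolution just doubles the dot clock; the scanline timing the monitor locks onto is unchanged.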
Then there's also the issue of sync rates, and this is a real limit on analog monitors that will affect the aspect ratio of the image and (more importantly) whether certain resolutions are supported at all, in terms of refresh rate and vertical resolution (number of scan lines). The horizontal sync rate defines and limits the vertical resolution, while the vertical sync rate is the screen refresh rate. (Multi-sync monitors are still limited by the specific sync rate ranges they support.)
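The trade-off falls straight out of the arithmetic: vertical refresh is just the horizontal sync rate divided by the total scanline count (active lines plus vertical blanking). A minimal sketch using the standard VGA line counts (449 total for 400-line modes, 525 for 480-line modes):

```python
# Sketch: on a fixed-hsync monitor, more scan lines per frame means a
# lower refresh rate -- the two are tied together by one division.

def vertical_refresh_hz(hsync_hz, active_lines, blank_lines):
    """Refresh rate a monitor can deliver at a given horizontal sync rate."""
    return hsync_hz / (active_lines + blank_lines)

HSYNC_VGA = 31_469  # ~31.5 kHz, standard fixed VGA horizontal rate

print(vertical_refresh_hz(HSYNC_VGA, 400, 49))  # 400-line modes: ~70 Hz
print(vertical_refresh_hz(HSYNC_VGA, 480, 45))  # 480-line modes: ~60 Hz
```

This is exactly why 640x400 gets the "comfortable 70 Hz" mentioned above while 640x480 is stuck at 60 Hz on the same monitor.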
Additional information about the resolution in use can also be sent to the monitor digitally via the VGA connector (if the monitor and the video card/driver support it), and this can provide automatic overscan and aspect ratio adjustment, since different resolutions may use different portions of the scan area. Many monitors (or video cards), though, still require manual scan adjustment.
Largely separate from these issues is the beam pitch itself (apart from the upper resolution limit mentioned above), and this is the cause of the "black lines" or gaps between scanlines seen at low resolutions. Many monitors show this even at 400 or 480 lines, and SDTVs almost always show it quite visibly when displaying non-interlaced images (i.e. 240p, common on older video game consoles and home computers). Unless a monitor has variable beam pitch, there's generally no practical way to avoid this problem in the analog domain. (Digitally, line doubling could be done via a line RAM buffer, though this is much more commonly done on the video card end, short of more complex upscaling.)
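The line-doubling idea is simple enough to sketch: buffer each incoming scanline and emit it twice, so a 200-line frame fills 400 scan lines and no dark gap rows are left between the original lines. A toy version (frames as plain lists of scanlines, purely for illustration):

```python
# Sketch of line doubling: every buffered scanline is output twice,
# turning an N-line frame into a 2N-line frame with no blank gaps.

def line_double(frame):
    """frame: list of scanlines (each scanline is any sequence of pixels)."""
    doubled = []
    for scanline in frame:
        doubled.append(scanline)
        doubled.append(scanline)  # repeat the buffered line
    return doubled

frame = [[1, 2], [3, 4]]   # tiny 2-line "frame"
print(line_double(frame))  # -> [[1, 2], [1, 2], [3, 4], [3, 4]]
```

This is essentially what VGA hardware does when it scans out 320x200 as 400 lines at 31 kHz, and what a monitor-side line buffer would do for a 15 kHz input.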
If the video card (or monitor) doesn't line-double the image and the monitor doesn't support wider beam pitches, you will see noticeable black lines at low vertical resolutions. (OTOH, if fixed at a wide beam pitch, high resolutions will appear blurrier.)
This isn't a problem with supporting a specific resolution or sync rate, but with the way the electron gun and screen are implemented. (A monitor explicitly supporting 320x200 at a 15 kHz sync rate -like CGA monitors, TVs, Amiga monitors, etc- would look identical to a 31 kHz VGA display if both used similar beam pitch and calibration; albeit, 15 kHz monitors are almost always specialized for lower-res displays and thus tend to have broader beams by nature.)
Another issue is the refresh rate of the vertical sync: monitors often use low-persistence phosphor to allow effective use of high refresh rates (high-persistence screens will show motion blur), but conversely, the lower the persistence, the worse the flicker at lower refresh rates (or with interlaced modes, for that matter -hence why most CRT TVs use high-persistence phosphor). This has no impact on the sharpness of the display, just flicker and the clarity of motion/animation on-screen. (The old IBM monochrome monitors are an extreme example of high-persistence phosphor, well beyond that of standard TVs . . . which makes sense given MDA's relatively low 50 Hz refresh rate.)