First post, by Jo22
Hi everyone,
Just found an interesting read!
https://virtuallyfun.com/wordpress/2014/04/14 … rt-ii-tenoxvga/
It's about the ECL signaling method that old high-definition "VGA" CRT monitors supported.
The same signaling was used on some Atari ST/TT computer systems
in order to provide a high-quality video signal.
In case you don't know, the Atari ST line had a high-resolution monochrome mode from the very start.
It ran at 640x400 @ 72 Hz / 35.7 kHz and, unlike the Commodore Amiga video modes, was based on a proper digital signal.
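As a quick sanity check on those numbers (my own back-of-the-envelope arithmetic, not from the article), the horizontal and vertical rates imply roughly 496 scanlines per frame, 400 of them visible:

/* Rough sanity check of the quoted ST high-res timings.
 * Assumption: ~35.7 kHz horizontal and ~72 Hz vertical, as quoted above;
 * the exact figures vary a little between sources. */
#include <stdio.h>

int main(void)
{
    const double h_freq  = 35700.0;  /* horizontal scan rate in Hz */
    const double v_freq  = 72.0;     /* vertical refresh rate in Hz */
    const int    visible = 400;      /* visible lines per frame */

    double total_lines = h_freq / v_freq;  /* ~496 lines per frame */
    printf("Total scanlines per frame: %.0f\n", total_lines);
    printf("Blanking lines (not visible): %.0f\n", total_lines - visible);
    return 0;
}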
Note that there's a small but important difference between binary and digital that most people don't get.
Binary means exactly two states (0/1), but digital can also cover many values.
Digital comes from Latin "digitus", meaning "finger".
Thus it means "fingered". A fork is "fingered", and so is a comb.
Essentially, it's about defined or pre-defined values.
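If you prefer it in code, here's a tiny sketch of mine (the level values are made up): binary is just the special case of a digital signal with exactly two defined levels.

/* "Digital" means a set of defined levels; "binary" is the special case
 * with two of them.  The level values here are made up for the example. */
#include <stdio.h>
#include <math.h>

/* Snap an arbitrary input value to the nearest defined level. */
static double snap(double v, const double *levels, int n)
{
    double best = levels[0];
    for (int i = 1; i < n; i++)
        if (fabs(v - levels[i]) < fabs(v - best))
            best = levels[i];
    return best;
}

int main(void)
{
    const double binary[]  = { 0.0, 1.0 };              /* two defined levels  */
    const double digital[] = { 0.0, 0.33, 0.66, 1.0 };  /* four defined levels */

    printf("0.4 as binary : %.2f\n", snap(0.4, binary, 2));   /* -> 0.00 */
    printf("0.4 as digital: %.2f\n", snap(0.4, digital, 4));  /* -> 0.33 */
    return 0;
}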
Now back to the SM-124 monochrome monitor of the Atari ST.
It used a digital video input, too, which means it could only handle levels in a defined range.
Depending on the level (there are just two, since the graphics are mono), a pixel becomes either black or white.
An ordinary VGA monitor, by contrast, accepts any values, including the "fingered" ones.
The drawback is that noise can sneak in: shades of gray can appear, because no filtering is done inside the VGA monitor.
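Here's how I'd picture that difference, as a toy sketch rather than a model of any real monitor: the digital mono input snaps everything to black or white, so a bit of noise changes nothing, while the analog VGA input reproduces whatever arrives, so the same noise shows up as a shade of gray.

/* Sketch of the noise behaviour, not a model of real hardware.
 * A two-level digital input thresholds the signal; an analog
 * input just passes it through. */
#include <stdio.h>

static double digital_mono(double v) { return v < 0.5 ? 0.0 : 1.0; }  /* black or white only */
static double analog_vga(double v)   { return v; }                    /* any value, noise included */

int main(void)
{
    double pixel = 1.0;           /* intended: full white */
    double noisy = pixel - 0.15;  /* same pixel with some noise picked up */

    printf("digital input: %.2f -> %.2f (still white)\n", noisy, digital_mono(noisy));
    printf("analog input : %.2f -> %.2f (light gray)\n",  noisy, analog_vga(noisy));
    return 0;
}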
It's a bit comparable to the Hercules graphics on PC/XTs, with the difference that TTL monitors have an additional intensity pin.
Timings and polarity may also differ, of course.
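Roughly how I understand the mono TTL case (simplified, and the brightness values below are made up; real monitors differ): the video and intensity pins are each just on or off, giving a few defined brightness steps rather than a continuous voltage.

/* Simplified view of a mono TTL input: one video pin and one intensity
 * pin, each simply on or off.  The brightness values are made-up
 * examples, not measurements of any real monitor. */
#include <stdio.h>

static double ttl_mono(int video, int intensity)
{
    if (!video)                     /* beam off: black */
        return 0.0;
    return intensity ? 1.0 : 0.66;  /* bright white vs. normal white */
}

int main(void)
{
    printf("video=1 intensity=0 -> %.2f\n", ttl_mono(1, 0));
    printf("video=1 intensity=1 -> %.2f\n", ttl_mono(1, 1));
    return 0;
}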
Best regards,
Jo22
"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel
//My video channel//