VOGONS


Reply 40 of 63, by Megadisk

SquallStrife wrote:

Edit: Checked it out and it's not a different part, BUT, the Intel 8042 is an 8-bit microcontroller, so IBM/Sega probably developed some custom code for it.

Yes, it's the same part, but like you said, most likely with different code. I desoldered it from the board to see if there was a way to dump it, but never got around to it. The Sega MD/Genesis controller only seems to work on the PC side with the "Puzzle Construction" game as well as in the Teradrive menu. Also, I should mention that if you use a 6-button controller (with PZLCNST), you must hold the "MODE" button during power-on (like in some Genesis/MD games), otherwise it will go crazy all over the place.

Last edited by Megadisk on 2016-09-25, 23:45. Edited 1 time in total.

Reply 41 of 63, by Scali

schadenfreude1 wrote:

There are CRT monitors that can display 240p signals all the way up to 1080i. Just draw thicker scanlines for low-resolution images.

What do you think the scandoubling is for? 😀
You can't control the thickness of the CRT beam. It is what it is. You can, however, draw the same scanline multiple times.
That's probably what 1080i CRT monitors do internally (modern hardware has all sorts of fancy digital resampling that wasn't feasible back in 1987, so you had to fix it at the input).
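
To make that concrete, here's a rough sketch in C of what scandoubling amounts to for a 320x200 image (purely illustrative; on the real card the CRTC just fetches every scanline twice on the fly, nothing is copied in software):

```c
#include <stdint.h>
#include <string.h>

/* Purely illustrative: what scandoubling boils down to for a 320x200
 * frame. Each source line is emitted twice, so the image fills 400
 * scanlines on a 31kHz monitor instead of 200 thin lines separated by
 * black gaps. On the real card the CRTC simply fetches every line
 * twice on the fly; nothing is copied in software like this. */
#define WIDTH 320
#define SRC_H 200
#define DST_H (SRC_H * 2)

void scandouble(const uint8_t src[SRC_H][WIDTH], uint8_t dst[DST_H][WIDTH])
{
    for (int y = 0; y < SRC_H; y++) {
        memcpy(dst[2 * y + 0], src[y], WIDTH); /* original scanline */
        memcpy(dst[2 * y + 1], src[y], WIDTH); /* same line repeated */
    }
}
```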

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 42 of 63, by SquallStrife

Megadisk wrote:

The Sega MD/Genesis controller only seems to work on the PC side with the "Puzzle Construction" game as well as in the Teradrive menu. Also, I should mention that if you use a 6-button controller (with PZLCNST), you must hold the "MODE" button during power-on (like in some Genesis/MD games), otherwise it will go crazy all over the place.

But that startup screen lets you assign keyboard keys to MD buttons (or vice versa, it's not clear), so I wonder what the purpose is?

VogonsDrivers.com | Link | News Thread

Reply 43 of 63, by SquallStrife

MMaximus wrote:

Thanks for posting these pics Squallstrife, that Teradrive seems really cool. Is the Mitsubishi CRT dual Sync?

Sure is! 15kHz up to 70kHz or so.

It's not a great quality monitor, though; anything over 800x600 is basically unreadable due to moiré interference.

VogonsDrivers.com | Link | News Thread

Reply 44 of 63, by schadenfreude1


Wow, this thread got really derailed! I'll see if I can put it back on its tracks.

So our consensus at this point is that IBM decided to remove backwards-compatibility for cost reduction, simplicity, and marketing reasons. That all makes sense to me, but I'm surprised that, as far as my Google searches yield, I might be the only person on the Internet who has asked this question; even the Wikipedia page on VGA doesn't mention this topic. I'm wondering if anyone knows of a technical document or interview with the engineers who worked on the VGA standard or whatever that can provide the "official" reason. Hell, maybe one of you fine friends worked at IBM back in the day!

133MHz wrote:

With that said, if there are parallel universes, and in one of them IBM developed a VGA standard that kept backwards compatibility with MDA/CGA/EGA modes in their original sync timings, requiring the need for a more complex tri-sync monitor at the start (like some Japanese computer manufacturers actually did for their domestic market), but therefore setting the lower horizontal scan range to 15kHz for every multisync PC display ever made after that... I'd totally give everything to live in that universe. 🤣

Amen to that! And the rest of your post is excellent; thanks for writing it up! You reminded me that early VGA monitors couldn't scan up to 1600x1200 or higher like the circa-2000 and later CRT monitors I primarily use. I'm assuming the earliest late-80s VGA monitors only did 640x480, meaning they weren't multisync like their descendants would be. And yes, as you said, once these multiscan monitors became ubiquitous, no one bothered to add support for 15kHz signals because there was no demand for it. Man, if only console gamers had clamored for it at the time!

reenigne wrote:

Just to add one more snippet of information to this: although IBM didn't make a 15kHz-capable monitor that plugged into the 15-pin VGA port or a VGA card with a digital (9-pin port) output, the actual VGA chipset is capable of generating 15kHz signals.

Yep, this is definitely true. I've used ARCMON and VGATV to output a 15kHz signal to my PVM, but ARCMON displays a black screen when I boot into a game, and VGATV displays color glitches in the graphics when I play games. I really wish I could get gaming on my DOS machine to work with a PVM, but these TSRs are too hit or miss. I've been testing out using DOSBox with Soft15kHz instead, and so far the results are excellent. If IBM had read my mind from the future, I wouldn't have to go to these lengths!
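
For what it's worth, the arithmetic behind getting 15kHz out of a VGA card is pretty simple. This is just a back-of-the-envelope sketch with made-up (but plausible) totals and clock choice, not what ARCMON or VGATV actually program, but it shows how NTSC-ish rates fall out of ordinary VGA numbers:

```c
#include <stdio.h>

/* Back-of-the-envelope sketch: the scan rates a VGA card produces are
 * just its pixel clock divided by the programmed totals. The totals and
 * clock below are hypothetical round numbers, not what ARCMON/VGATV
 * actually program, but they show how NTSC-ish 15.7kHz/60Hz timing
 * falls out of perfectly ordinary VGA figures. */
int main(void)
{
    double pixel_clock     = 25175000.0 / 2.0; /* 25.175 MHz clock, halved */
    double pixels_per_line = 800.0;            /* total incl. blanking     */
    double lines_per_frame = 262.0;            /* total incl. blanking     */

    double hfreq = pixel_clock / pixels_per_line; /* ~15.73 kHz */
    double vfreq = hfreq / lines_per_frame;       /* ~60 Hz     */

    printf("hsync: %.0f Hz, vsync: %.2f Hz\n", hfreq, vfreq);
    return 0;
}
```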

Scali wrote:

Isn't the answer to the original question simply "image quality"?
VGA monitors had to be capable of 400 and 480-line modes. That meant they had to have quite a fine raster/small dot pitch. Which means if you were to actually feed such a monitor a 200-line signal, you'd clearly see the black spacing between the scanlines, and it would look horrible.

You mean like how you can see the black spacing when using a CGA monitor? or a 1084 with an Amiga? or any low-res arcade game in a cabinet? Just because the VGA standard has to support 400- and 480-line modes doesn't mean it couldn't have also supported a 240-line mode at 15kHz. The resulting image would look exactly like you were displaying it on a CGA monitor.

Maybe what you're getting at is that the increasing sizes of monitors would make the empty lines in low-resolution modes much more obvious. What I mean is that computer desk standards weren't changing — the distance between your face and the computer screen is relatively fixed — but the size of the monitors was increasing. So, given a fixed viewing distance, the bigger the screen size, the bigger and more obvious the gaps between the scanlines in a low-resolution mode. Maybe IBM saw the future of 14 inch and bigger monitors and decided to line-double the output rather than have users complain about being able to see the empty lines from their usual viewing distance — though I find it funny that in the 90s and later, people would game on 29" arcade monitors displaying 240p signals with very visible scanlines, their faces within a foot from the screen, and not mind it in the slightest. But I can see the marketing use for a line-doubled mode for business applications, and I assume that was IBM's primary market.

Scali wrote:

Nowhere near as bad as on a VGA monitor with its higher resolution.
An 1084 is only PAL/NTSC resolution, so only 256/200 vertical lines. VGA has 480 max. So the lines are much thinner.

But again, if IBM had decided to go with a multisync solution for VGA — kinda like how EGA monitors support both 21kHz and 15kHz signals — then the lines for a 15kHz signal would be as thick as they would appear on a CGA monitor, with no need for line-doubling.

Reply 45 of 63, by SquallStrife

schadenfreude1 wrote:

Wow, this thread got really derailed! I'll see if I can put it back on its tracks.

Not really, it's all related. Keep your pants on! 😀

schadenfreude1 wrote:

You mean like how you can see the black spacing when using a CGA monitor? or a 1084 with an Amiga? or any low-res arcade game in a cabinet? Just because the VGA standard has to support 400- and 480-line modes doesn't mean it couldn't have also supported a 240-line mode at 15kHz. The resulting image would look exactly like you were displaying it on a CGA monitor.

...

But again, if IBM had decided to go with a multisync solution for VGA — kinda like how EGA monitors support both 21kHz and 15kHz signals — then the lines for a 15kHz signal would be as thick as they would appear on a CGA monitor, with no need for line-doubling.

No. You're ignoring the bit where the scanning dot is much smaller.

VogonsDrivers.com | Link | News Thread

Reply 46 of 63, by Scali

schadenfreude1 wrote:

You mean like how you can see the black spacing when using a CGA monitor? or a 1084 with an Amiga?

That's different, because that's caused by using progressive scan on displays designed for interlaced images. There is 'some' black spacing, but it is not an entire scanline or more.
They didn't put these black lines in 'by design'; it was just that stock PAL/NTSC CRTs were cheap and easy to come by, and a progressive scan video circuit was cheaper and easier to build than an interlaced one.
So it's a design flaw. VGA came with powerful enough hardware and good enough monitors to fix that flaw. So they did. Besides, if they didn't, it would have looked far worse (as I say, try it).

schadenfreude1 wrote:

The resulting image would look exactly like you were displaying it on a CGA monitor.

No, it would not. Try it.

schadenfreude1 wrote:

But again, if IBM had decided to go with a multisync solution for VGA — kinda like how EGA monitors support both 21kHz and 15kHz signals — then the lines for a 15kHz signal would be as thick as they would appear on a CGA monitor, with no need for line-doubling.

I don't think it would work that way. The CRT can only be designed for one type of beam as far as I know. So you either have a thin beam and a fine raster, or you have a thick beam and a coarse raster. But I don't think you can do both. At least, not with affordable 1987 technology.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 47 of 63, by SquallStrife

reenigne wrote:

Another composite CGA game to try is VileRancour's recent port of Commander Keen 4 to composite CGA.

The colours are a little washed out, but they're definitely there and "correct".

[Screenshots: Commander Keen 4 in composite CGA]

VogonsDrivers.com | Link | News Thread

Reply 48 of 63, by Scali


Even if it was a clone CGA card, it would be quite good, colour-wise. I've seen various 'brand' CGA clones that aren't that close to real CGA in composite. In fact, an ATi Small Wonder isn't even close: https://youtu.be/eRVwYCq8X5w
But for VGA to have composite at all is nice (well, in the PCI-era it became somewhat common to have TV-out with s-video and/or composite), and to actually support CGA artifacting accurately is nothing short of amazing.
It looks like Sega finally fixed what was wrong in the original CGA design, and corrected the colour burst in 80-column mode as well.

And from the other images I understand that the composite signal also works for 320x200 EGA and VGA modes? Only hi-res modes like 640x350/640x480 don't work?
That's a nice bonus. It's a shame that more video cards of that era didn't offer this functionality; it could have made the threshold for upgrading to VGA a lot lower. I recall that upgrading my XT to a VGA card and monitor cost stupid money. If I could have just used a regular TV or a PAL/NTSC-compatible monitor, it would have saved a lot of money.
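
As an aside on the artifacting itself, here's a toy model (definitely not a real NTSC decoder) of why composite CGA produces colours at all: the 640-wide pixel clock is 14.318 MHz, exactly four times the 3.58 MHz colour subcarrier, so every group of four pixels spans one subcarrier period and the TV reads the on/off pattern as chroma:

```c
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

/* Toy model, not a real NTSC decoder. In 640-wide mode the CGA pixel
 * clock (14.318 MHz) is exactly 4x the 3.579545 MHz colour subcarrier,
 * so each group of 4 pixels spans one subcarrier period. A TV's chroma
 * demodulator sees the on/off pattern as a 3.58 MHz component: its
 * amplitude becomes saturation, its phase becomes hue, and the average
 * becomes brightness. The 4-point DFT below extracts exactly that. */
static void artifact_colour(const int px[4])
{
    double re   = px[0] - px[2];  /* subcarrier-rate component */
    double im   = px[1] - px[3];
    double luma = (px[0] + px[1] + px[2] + px[3]) / 4.0;
    double sat  = sqrt(re * re + im * im) / 2.0;
    double hue  = atan2(im, re) * 180.0 / PI;

    printf("pattern %d%d%d%d -> luma %.2f, sat %.2f, hue %4.0f deg\n",
           px[0], px[1], px[2], px[3], luma, sat, hue);
}

int main(void)
{
    int a[4] = {1, 1, 0, 0}; /* two on, two off: strong artifact colour  */
    int b[4] = {0, 1, 1, 0}; /* same duty cycle shifted: a different hue */
    int c[4] = {1, 1, 1, 1}; /* solid: no chroma at all, just brightness */
    artifact_colour(a);
    artifact_colour(b);
    artifact_colour(c);
    return 0;
}
```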

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 49 of 63, by SquallStrife


The mobo was apparently designed by IBM. The VGA chip is a WD90C10.

And yep, VGA 480 line mode causes the composite monitor to stop working. Haven't tried 350-line mode (I'll get out Thexder later and try), but I presume it'd behave the same.

VogonsDrivers.com | Link | News Thread

Reply 50 of 63, by Scali

SquallStrife wrote:

The mobo was apparently designed by IBM. The VGA chip is a WD90C10.

Ah, a Paradise SVGA chip. That was a decent performer back in the day as well.

SquallStrife wrote:

And yep, VGA 480 line mode causes the composite monitor to stop working.

That would imply that some games using a 320x240 VGA tweakmode would also not work, because effectively that's a 480 line mode after scandoubling as well.
I can't think of any 286-compatible games that use that tweakmode though... But they probably exist.

Edit: By the way, I see that it (obviously) has a PSG chip for the Mega Drive part. Do you have any idea whether the chip can also be used in DOS mode? In theory they could have mapped it to the same address as the PCjr/Tandy sound chip and made it compatible that way (you could probably test that with a Sierra game; you should be able to configure VGA graphics with Tandy audio).
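
For reference, this is roughly what a Tandy/PCjr title does to the sound chip, sketched with Borland-style port I/O; the port and chip details are the standard PCjr/Tandy ones, and whether the Teradrive routes that port to its PSG at all is exactly the open question:

```c
#include <dos.h> /* outportb()/delay(), Borland-style; adjust for other compilers */

/* Sketch of what a Tandy/PCjr game does for sound: write SN76489-family
 * command bytes to I/O port 0C0h. The Mega Drive PSG is the same chip
 * family, so *if* the Teradrive decoded that port to it, Tandy audio
 * would just work; whether it actually does is the open question here. */
#define PSG_PORT  0x00C0      /* PCjr/Tandy 1000 sound chip port */
#define PSG_CLOCK 3579545UL   /* chip clock in Hz                */

static void psg_tone(int channel, unsigned long freq_hz, int attenuation)
{
    unsigned int div = (unsigned int)(PSG_CLOCK / (32UL * freq_hz));

    /* latch byte: 1 cc 0 dddd -> channel + low 4 bits of the divider */
    outportb(PSG_PORT, 0x80 | (channel << 5) | (div & 0x0F));
    /* data byte:  0 0 dddddd -> high 6 bits of the divider */
    outportb(PSG_PORT, (div >> 4) & 0x3F);
    /* attenuation: 1 cc 1 aaaa, 0 = full volume, 15 = silent */
    outportb(PSG_PORT, 0x80 | (channel << 5) | 0x10 | (attenuation & 0x0F));
}

int main(void)
{
    psg_tone(0, 440, 0);   /* channel 0: ~440 Hz at full volume */
    delay(500);            /* let it play for half a second     */
    psg_tone(0, 440, 15);  /* then silence the channel          */
    return 0;
}
```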

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 51 of 63, by SquallStrife

Scali wrote:

That would imply that some games using a 320x240 VGA tweakmode would also not work, because effectively that's a 480 line mode after scandoubling as well.

Mode 13h games work pretty well (as I demonstrated earlier with Wolf3D); do you know of any Mode X "test" apps?

Scali wrote:

Edit: By the way, I see that it (obviously) has a PSG chip for the Mega Drive part. Do you have any idea whether the chip can also be used in DOS mode? In theory they could have mapped it to the same address as the PCjr/Tandy sound chip and made it compatible that way (you could probably test that with a Sierra game; you should be able to configure VGA graphics with Tandy audio).

Good point! The level of integration between the two sides isn't well documented, so it's possible.

I'll go try some stuff now.

Edit: Neat aside, the PC Speaker sound is replicated on the RCA audio outputs. https://www.youtube.com/watch?v=RFJ6fbTAX74

Edit2: PSG is a no-go. Tried in SQ3 and with SBVGM.

OTOH, EGA 640x350 seems to work, though of course the dithering is lost.

On RGB monitor:
[screenshots]

On composite monitor:
[screenshots]

VogonsDrivers.com | Link | News Thread

Reply 52 of 63, by Scali

SquallStrife wrote:

OTOH, EGA 640x350 seems to work, though of course the dithering is lost.

That's interesting; it seems to perform some kind of vertical resampling to convert it to an NTSC-compatible resolution.
You lose some resolution, but it's still usable.
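
Something along these lines, conceptually (a purely illustrative nearest-line downscale; the actual encoder chip may well blend adjacent lines instead):

```c
#include <stdint.h>

/* Purely illustrative guess at "vertical resampling": squeeze a 350-line
 * EGA frame into the ~240 visible lines of an NTSC frame by picking the
 * nearest source line for each output line. A real encoder chip might
 * blend neighbouring lines instead; either way some vertical detail
 * (such as the dithering) is lost, which matches the screenshots. */
#define WIDTH 640
#define SRC_H 350
#define DST_H 240

void downscale_vertical(const uint8_t src[SRC_H][WIDTH],
                        uint8_t dst[DST_H][WIDTH])
{
    for (int y = 0; y < DST_H; y++) {
        int sy = (y * SRC_H) / DST_H; /* nearest source line */
        for (int x = 0; x < WIDTH; x++)
            dst[y][x] = src[sy][x];
    }
}
```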

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 53 of 63, by schadenfreude1

SquallStrife wrote:

No. You're ignoring the bit where the scanning dot is much smaller.

Scali wrote:
schadenfreude1 wrote:

The resulting image would look exactly like you were displaying it on a CGA monitor.

No, it would not. Try it.

I think you're both misunderstanding me. Of course displaying this image on an existing VGA monitor would have very small scanlines, but I am talking about the hypothetical situation where VGA monitors would support the NTSC broadcast standard as well. In that case, the monitor would display 15kHz 200p signals in "double-strike" mode or "non-interlaced mode" or whatever you want to call it, just like how SDTVs display these sub-480i signals from video game consoles — and just like how CGA monitors would display them, thus maintaining backward compatibility.

But as you wrote here...

Scali wrote:

So you either have a thin beam and a fine raster, or you have a thick beam and a coarse raster. But I don't think you can do both. At least, not with affordable 1987 technology.

...that hypothetical might not have been cost-effective for 1987, though here's an NEC monitor from 1987 that supports 15kHz/24kHz/31kHz (for 128,100 yen): http://homepage1.nifty.com/y-osumi/parts/pc-tv453n/

And EGA monitors are multisync; were they cost-effective when they were released?

Scali wrote:

They didn't put these black lines in 'by design', it was just that stock PAL/NTSC CRTs were cheap to come by, and making a progressive scan video circuit was cheaper and easier to build than an interlaced one.
So it's a design flaw. VGA came with powerful enough hardware and good enough monitors to fix that flaw. So they did. Besides, if they didn't, it would have looked far worse (as I say, try it).

I wouldn't call it a "design flaw" — more like a clever trick to our benefit. The alternative would be to view interlaced video on a screen a foot away from your face, which would give me a headache. By comparison, displaying interlaced video on televisions was more tolerable because we sat back a lot farther from them. Unfortunately, using the non-interlaced mode halves the resolution, but at least we get to keep our eyesight. Seems like a fair deal! Plus the early video game consoles would have a hard time displaying graphics at that high resolution while maintaining a high frame rate. Maybe less graphically intensive business applications could handle the higher resolutions? I can't comment because I never used DOS for much more than gaming.

Reply 54 of 63, by Scali

schadenfreude1 wrote:

but I am talking about the hypothetical situation where VGA monitors would support the NTSC broadcast standard as well.

Did you never wonder why such monitors don't exist?
If we look at monitors/TVs from the broadcasting world, we see that higher resolution monitors tend to have resampling of low-resolution signals in hardware to improve image quality.
Just look at some of the last CRT TVs that were available, in the HD-era, which could take HD-ready signals.

schadenfreude1 wrote:

...that hypothetical might not have been cost-effective for 1987, though here's an NEC monitor from 1987 that supports 15kHz/24kHz/31kHz (for 128,100 yen): http://homepage1.nifty.com/y-osumi/parts/pc-tv453n/

Question is: How does the 15kHz look?
I had an Eizo Flexscan 15" monitor back in the day, which did 15 kHz, but it was ugly, as I said.

schadenfreude1 wrote:

And EGA monitors are multisync;

No, they aren't. You might want to get your facts straight before making such claims. It's damn annoying to have to correct basic stuff like this, which invalidates your entire argument to begin with.
EGA monitors aren't multisync, they're dual-sync.
They only support two hardwired modes, and the mode is selected by the polarity of the vertical sync pulse.
Multisync is completely different, and can basically support 'any' timing within a certain range. It automatically detects the signal and locks on to it.
Mind you, the VGA standard itself is not multisync. A standard VGA monitor only has to support the standard VGA modes, much like how EGA works.
Many VGA monitors are actually SVGA monitors, and use multisync for compatibility with a wide range of hardware.

schadenfreude1 wrote:

I wouldn't call it a "design flaw" — more like a clever trick to our benefit.

A 'trick' implies that they actively designed something to make something possible.
But since they just use standard progressive scan NTSC-compatible signals, they're not doing anything special. Just using the CRT the way it was originally designed (for NTSC, except that NTSC interlaces even and odd fields, while this hardware only ever outputs the even field, so every scanline lands in the same place).

schadenfreude1 wrote:

The alternative would be to view interlaced video on a screen a foot away from your face, which would give me a headache.

Doesn't sound like you know much about how the hardware works.
You should try coding on the Amiga sometime. It is one of the few computer systems that natively supports interlaced screenmodes. The headache you will get is that you have to design your game to take care of the differences in even and odd field timings, scanline placement and that sort of thing.
The Amiga has a very powerful copper chip to help you with that (it supports two copperlists, so you can use separate even and odd copperlists). But that's a high-end 16-bit machine from 1985. With the technology from the late 70s or early 80s, it was far too complicated to make a circuit that supported interlaced graphics for games. Heck, early arcade machines barely even supported a framebuffer. It was all racing the beam.
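
Just to illustrate the kind of bookkeeping involved (a generic sketch, not Amiga copper code): every vertical blank you have to emit the other half of the lines, shifted by one:

```c
#include <stdint.h>

/* Generic sketch of the extra bookkeeping interlace forces on a game
 * (not Amiga copper code): a 400-line frame is shown as two 200-line
 * fields, and on every vertical blank you must emit the *other* set of
 * lines, offset by one. Forget to alternate and half the image is gone;
 * get the order wrong and everything judders by a line. */
#define WIDTH   320
#define FRAME_H 400
#define FIELD_H (FRAME_H / 2)

void emit_field(const uint8_t frame[FRAME_H][WIDTH],
                uint8_t field[FIELD_H][WIDTH],
                int odd) /* 0 = even field, 1 = odd field */
{
    for (int y = 0; y < FIELD_H; y++)
        for (int x = 0; x < WIDTH; x++)
            field[y][x] = frame[2 * y + odd][x];
}
```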

schadenfreude1 wrote:

By comparison, displaying interlaced video on televisions was more tolerable because we sat back a lot farther from them. Unfortunately, using the non-interlaced mode halves the resolution, but at least we get to keep our eyesight.

You are completely ignoring the fact that they used standard NTSC CRTs because they were commodity hardware.
In the professional world (eg CAD), monitors with much higher resolutions (and no interlacing) were available. They were just stupid expensive, so you couldn't put them into an arcade machine and hope to get a profit.

It's kinda like arguing that the C64 only had 64K of memory, whereas the Apple Lisa came with 1MB. Sure, the technology was there to put 1MB of memory in a machine. Problem is, the Lisa had a price tag of $10000, and the C64 was meant to be affordable.
Like home computers of the era, arcade machines didn't represent the state-of-the-art of technology, and a lot of choices were made simply to cut costs and make them affordable enough for a wide audience.
You're just trying to romanticize arcade machines in a way that is far from realistic.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 55 of 63, by Jo22

Scali wrote:

You're just trying to romanticize arcade machines in a way that is far from realistic.

By the way, that's something I do encounter quite often these days.
Back in the day, visible scanlines were considered an absolute no-go.
Several types of devices, like scan-doublers and flicker-fixers, were sold to do away with this.

It even goes so far that consoles designed for TVs, like the Genesis or NES, get modded to look more arcade-like.
Nothing against RGB, but people tend to forget that game designers made use of the limitations of composite video to their advantage. These games were never meant to look "pixelated"; their developers aimed for high-quality graphics, just as they do today. A lot of different dithering techniques were used to work around colour limitations, too.
A popular example was the checkerboard pattern used to create the illusion of transparency.
Another famous example is the waterfall in Sonic 1. On RF or composite you can see transparency and a rainbow effect.
Or the wings of the Arwings in StarFox/StarWing on the Super NES.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 56 of 63, by schadenfreude1

Scali wrote:

If we look at monitors/TVs from the broadcasting world, we see that higher resolution monitors tend to have resampling of low-resolution signals in hardware to improve image quality. Just look at some of the last CRT TVs that were available, in the HD-era, which could take HD-ready signals.

Yes, sometimes, but not all the time. I used to own a Sony HD CRT, and it would display low-resolution signals with no discernible scanlines. But if you take any of the monitors in the Sony PVM/BVM line that support 15kHz as well as 31kHz signals, they display low-resolution images with visible scanlines. I've used one of these PVMs as well as a PVM that only accepts 15kHz signals, and both display low-resolution signals in the same manner.

Scali wrote:

EGA monitors aren't multisync, they're dual-sync.

Sorry, I used the term "multisync" because EGA monitors can sync to two different horizontal frequencies. I take it that because the vertical frequency is always 60Hz, this doesn't fit the definition of multisync?

Scali wrote:

You are completely ignoring the fact that they used standard NTSC CRTs because they were commodity hardware.

No, I was just saying that, given that we were using the NTSC standard for our computers and video game consoles, it's a good thing they used the low-resolution mode — whether required to by hardware restrictions or not — because even if the hardware were capable of displaying in the interlaced mode, the resultant flickering image wouldn't be as pleasant to look at when reading text from a short viewing distance.

Scali wrote:

You're just trying to romanticize arcade machines in a way that is far from realistic.

No. My bias is obvious, but the purpose of this thread is to determine why the VGA standard line-doubles low-resolution images. We've come to a consensus on this topic, but debating the artistic merits of this implementation is a separate discussion.

Reply 57 of 63, by schadenfreude1

Jo22 wrote:

Back in the day, visible scanlines were considered an absolute no-go.
Several types of devices, like scan-doublers and flicker-fixers, were sold to do away with this.

This is interesting. I assume these were sold for computer users? Growing up, on all the CRTs I used for console gaming, I never noticed the empty lines because of the color blooming and lack of sharpness. Can you point me to some of these devices so I can read more about them?

Jo22 wrote:

It even goes so far that consoles designed for TVs, like the Genesis or NES, get modded to look more arcade-like.

I assume you mean add RGB support? The Genesis supports that natively, but yes, the NES requires a relatively pricey and invasive procedure to output RGB.

Jo22 wrote:

Nothing against RGB, but people tend to forget that game designers made use of the limitations of composite video to their advantage.
...
A popular example was the checkerboard pattern used to create the illusion of transparency.
Another famous example is the waterfall in Sonic 1. On RF or composite you can see transparency and a rainbow effect.
Or the wings of the Arwings in StarFox/StarWing on the Super NES.

Definitely, but these tricks comprise only a small portion of the graphics in the games. Well, maybe for Star Fox the RF/composite smear would be good because the game's graphics look like shit. 😵 But in something like Sonic, would you rather have 5% of the game look awesome or 95%? I asked myself the same question before making the jump to RGB, but I can also appreciate viewing them in the same way I saw them when I was growing up.

Jo22 wrote:

These games were never meant to look "pixelated"; their developers aimed for high-quality graphics, just as they do today.

I'd say on any CRT — except the aforementioned HD CRTs — the graphics won't look pixelated; rather, you'll see sharp lines if you're using an aperture grille or that soft honeycomb pattern if you're using a shadow mask, no matter what video connection cable you're using. The graphics will look pixelated if you use an LCD with a bad scaler or don't know what you're doing when you use an emulator on your computer. 😀

Reply 58 of 63, by Scali

schadenfreude1 wrote:

Sorry, I used the term "multisync" because EGA monitors can sync to two different horizontal frequencies. I take it that because the vertical frequency is always 60Hz, this doesn't fit the definition of multisync?

No, 'sync' in this case applies to both horizontal and vertical sync.
The thing is that EGA has two hardwired modes, as I said. The monitor just 'knows' that when the vsync pulse is positive, it is a 200-line mode, and when it is negative, it is a 350-line mode.
Multisync is where the monitor actively samples the incoming signal and detects the correct horizontal and vertical sync frequencies by itself, not by some pre-defined 'trick' like with EGA.
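
As a toy model of the difference (not real monitor firmware, and the numbers are just the nominal EGA rates):

```c
#include <stdio.h>

/* Toy model of the difference, not real monitor firmware. A dual-sync
 * EGA monitor has two fixed modes and picks one purely from the vertical
 * sync polarity; a multisync monitor measures the incoming sync rate and
 * locks onto anything inside its supported range. */
struct monitor_mode { double hsync_khz; int lines; };

/* EGA style: hardwired modes, chosen by vsync polarity alone */
static struct monitor_mode ega_dual_sync(int vsync_positive)
{
    struct monitor_mode m;
    if (vsync_positive) { m.hsync_khz = 15.7; m.lines = 200; }
    else                { m.hsync_khz = 21.8; m.lines = 350; }
    return m;
}

/* Multisync style: no mode table, just "can I lock onto what I measure?" */
static int multisync_locks(double measured_khz, double min_khz, double max_khz)
{
    return measured_khz >= min_khz && measured_khz <= max_khz;
}

int main(void)
{
    struct monitor_mode m = ega_dual_sync(1);
    printf("EGA monitor assumes a %d-line mode at %.1f kHz\n", m.lines, m.hsync_khz);
    printf("31.5 kHz into a 30-60 kHz multisync: %s\n",
           multisync_locks(31.5, 30.0, 60.0) ? "locks on" : "out of range");
    return 0;
}
```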

schadenfreude1 wrote:

No. My bias is obvious, but the purpose of this thread is to determine why the VGA standard line-doubles low-resolution images. We've come to a consensus on this topic, but debating the artistic merits of this implementation is a separate discussion.

'Artistic merits' is one thing, but you were trying to spin it as if arcade machine designers actively went for a 'scanline' image because they thought it 'looked better'. Which is nonsense.
If anything, the artists made the best possible use of what tools they were given.

I suppose what works against VGA is that in the early days, PCs were not the primary target for game development. Most VGA games from the 80s and early 90s just had their graphics copied directly from Atari ST/Amiga. So no consideration was given to the extra colours that VGA could offer, or the different aspect ratio, or the different appearance of the pixels on screen, due to VGA monitors being very different from your average budget PAL/NTSC display used for Atari ST/Amiga.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 59 of 63, by Scali

schadenfreude1 wrote:

This is interesting. I assume these were sold for computer users? Growing up, on all the CRTs I used for console gaming, I never noticed the empty lines because of the color blooming and lack of sharpness. Can you point me to some of these devices so I can read more about them?

They were very common in the Amiga world, as I already mentioned.
Mainly for using a 'professional' SVGA monitor on a standard Amiga, which became popular in the mid-90s among people who were still using their Amigas, once SVGA monitors had become cheap and commonplace. They were nice upgrades/replacements for the aging Amiga monitors.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/