VOGONS


First post, by keenmaster486

User metadata
Rank l33t
Rank
l33t

Please view the attached image to get an idea of what I am talking about. I captured that screenshot just now from DOSBox on my 1440p LCD monitor. It looks crisp and pleasant to the eye, no blur. The aspect ratio is displayed correctly rather than being stretched to the screen size. View it on a 1440p monitor at fullscreen, or zoom in to see what I mean.

I wish I could display this type of image from a DOS computer VGA source to the same monitor rather than having to use emulation.

Everyone knows the problems with displaying low resolutions on LCDs: they are basically unusable due to the blur introduced by scaling algorithms.

But with higher-resolution displays such as 1440p or 4K, integer scaling actually looks good: each scaled "pixel" is uniform in size, unlike the odd ratios you would get on a 1024x768 panel, for example.
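The arithmetic behind that is simple: pick the largest whole-number multiplier that still fits the panel. A quick Python sketch (panel sizes here are just examples):

```python
def max_integer_scale(src_w, src_h, panel_w, panel_h):
    """Largest whole-number multiplier so the scaled image still fits the panel."""
    return min(panel_w // src_w, panel_h // src_h)

# 640x480 on a 2560x1440 panel: 3x fills the height exactly (480 * 3 = 1440)
print(max_integer_scale(640, 480, 2560, 1440))  # 3
# 640x480 on a 1024x768 panel: only 1x fits, hence the blurry fractional scaling
print(max_integer_scale(640, 480, 1024, 768))   # 1
```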

Is there a solution for this? Everywhere I look it seems like basically the same problems that were introduced by shortsighted engineers in the early 2000s have persisted across the entire LCD industry this whole time.

My new 1440p monitor does have an option to display the correct aspect ratio rather than stretching, which is nice. But it still displays the picture as blurry as possible, which could only have been programmed in on purpose, presumably for no other reason than to torture people who wish to display resolutions lower than native. Even 640x480 looks very nice with integer scaling, no glitchy pixels, as demonstrated in DOSBox.

Any monitors that don't do this? Scalers I can place in between the source and monitor that will accomplish what I am looking for? I know scalers exist for retro consoles, but haven't seen any for VGA that claim to do this.
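For anyone wanting to reproduce the DOSBox side of this, the crisp fullscreen output mostly comes down to a few dosbox.conf settings, something like the following (option names are from vanilla DOSBox 0.74; forks may differ):

```ini
[sdl]
fullscreen=true
# "nb" = no bilinear: nearest-neighbour output instead of blurry filtering
output=openglnb

[render]
# keep the 4:3 aspect ratio instead of stretching to the panel
aspect=true
# no software scaler; leave enlargement to the sharp output mode
scaler=none
```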

Attachments

World's foremost 486 enjoyer.

Reply 1 of 21, by darry

User metadata
Rank l33t++
Rank
l33t++
keenmaster486 wrote on 2022-09-14, 19:16:
Please view the attached image to get an idea of what I am talking about. I captured that screenshot just now from DOSBox on my […]
Show full quote


What model is your monitor ?

Have a look here for some photos of my results using an OSSC and a Philips 252B9 monitor (with an explicit 4:3 mode, which is required for maintaining proper aspect ratio in 320x200):
Re: Widescreen monitors and 4:3 aspect ratio compatibility thread (photo comparison with a GeForce FX5900's scaling)
Re: I tested several pci video cards for picture quality in dos games. (photo of corner detail and direct capture of OSSC output)

Reply 2 of 21, by keenmaster486

User metadata
Rank l33t
Rank
l33t

OSSC seems to work rather well for this purpose. Maybe I will have to give it a try. How does it handle CGA/EGA modes and 70Hz?

World's foremost 486 enjoyer.

Reply 3 of 21, by maxtherabbit

User metadata
Rank l33t
Rank
l33t

It handles 70Hz just fine provided your monitor can handle a 70Hz input; it is just a line doubler, so refresh in = refresh out.

CGA/EGA modes are no issue when output through a VGA card; connecting the OSSC to a real CGA/EGA card would require a TTL RGBI to analog RGB adapter.
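For anyone curious what such an adapter has to do colour-wise, here is a sketch of the usual CGA RGBI-to-analog mapping (levels follow the common IBM 5153 monitor convention, including the brown exception):

```python
def rgbi_to_rgb(r, g, b, i):
    """Map CGA TTL RGBI bits (0/1 each) to 8-bit analog RGB levels."""
    def level(bit, intensity):
        # colour bit contributes 2/3 of full scale, intensity bit 1/3
        return 0xAA * bit + 0x55 * intensity
    rgb = [level(r, i), level(g, i), level(b, i)]
    # Colour 6 (r=1, g=1, b=0, i=0) is darkened from dark yellow to brown
    # by halving the green gun -- a quirk built into most CGA monitors.
    if (r, g, b, i) == (1, 1, 0, 0):
        rgb[1] = 0x55
    return tuple(rgb)

print(rgbi_to_rgb(1, 1, 0, 0))  # (170, 85, 0) -> brown
print(rgbi_to_rgb(1, 1, 1, 1))  # (255, 255, 255) -> bright white
```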

Reply 4 of 21, by Azarien

User metadata
Rank Oldbie
Rank
Oldbie
keenmaster486 wrote on 2022-09-14, 19:16:

Please view the attached image to get an idea of what I am talking about. I captured that screenshot just now from DOSBox on my 1440p LCD monitor. It looks crisp and pleasant to the eye, no blur.

"Pleasant to the eye" is subjective. On a CRT you never had this kind of sharp pixelated picture, there was always a bit of blurring and color bleeding.

I agree that monitor and GPU scaling algorithms could be better.

Reply 5 of 21, by keenmaster486

User metadata
Rank l33t
Rank
l33t
Azarien wrote on 2022-09-15, 15:41:

On a CRT you never had this kind of sharp pixelated picture

On a good sharp CRT at 320x200, the only artifacting you will see is the scanlines. I have a Dell CRT from the early 2000s that is still operating within spec and does actually look this sharp. That's what I'm going for.

My other CRTs do have some blur and artifacting that can't be adjusted away with the focus pot.

World's foremost 486 enjoyer.

Reply 6 of 21, by Jo22

User metadata
Rank l33t++
Rank
l33t++

But isn't VGA usually doubling 200 line modes?
320x200 should be 320x400 to the monitor.
So there's little left for thick scan lines?

On a normal "progressive" scan TV picture, there are visible scan lines, because it's kind of a cheat.

The older home computers (say Amiga)/consoles with composite do skip either the odd or even lines,
which also reduces flicker as a side effect.
The "scan lines" are essentially the odd or even lines that were not used (stayed black) by the video chip.

But it's not real progressive scan. If it were, both fields would be used,
either by line doubling (like VGA does) or by making use of the full resolution (without line-hopping techniques à la interlacing).

- Of course, that's not possible with normal TV sets, the bandwidth would be much too high. Except for the line doubling thing, maybe (odd/even lines have same picture data).
But a VGA monitor could do both methods just fine.

Modern VGA monitors, I mean, from the mid-90s onwards. They can do 15kHz/31kHz and both progressive and interlaced.
VGA's line doubling technique was likely used so that early
VGA monitors didn't have to support 200-line modes and TV-compatible timings.
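The doubling itself is trivial: each of the 200 scanlines simply goes out twice, giving 400 lines at the monitor. A toy sketch, treating the frame as a list of rows:

```python
def line_double(frame):
    """Repeat each scanline once: 200 input lines -> 400 output lines."""
    return [row for row in frame for _ in (0, 1)]

frame_200 = [[x] * 320 for x in range(200)]   # dummy 320x200 frame
frame_400 = line_double(frame_200)
print(len(frame_400))                 # 400
print(frame_400[0] == frame_400[1])   # True: adjacent output lines are identical
```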

Edit: Could this be the reason why MCGA didn't use anything besides 320x200, 640x200 (CGA) and 640x480?
A 320x240 resolution wouldn't be line doubled by VGA, would it?
A hacked mode, say, a 320x199 mode (Jazz Jackrabbit?) wouldn't be line doubled either, I suppose. 🤷‍♂️

Edit: Edited. Second time. Third time. Fourth..

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 7 of 21, by darry

User metadata
Rank l33t++
Rank
l33t++
Jo22 wrote on 2022-09-15, 18:06:
But isn't VGA usually doubling 200 line modes? 320x200 should be 320x400 to the monitor. So there's little left for thick scan l […]
Show full quote


VGA mode 13h is 320x200, line-doubled to 640x400 by the VGA card and displayed at 70Hz.

So-called "hacked" modes are also line doubled (there may be exceptions).

My recollection of all VGA modes on my TTX monitor (with a .28mm dot pitch) is that they were quite sharp and without significant scanlines. A lot of my friends had far crappier monitors with blurry VGA, even at 320x200.

Reply 8 of 21, by Jo22

User metadata
Rank l33t++
Rank
l33t++
darry wrote on 2022-09-15, 18:52:
VGA mode 13h is 320x200, line-doubled to 640x400 by the VGA card and displayed at 70Hz . […]
Show full quote

Early, lower-end VGA monitors sometimes had a .52mm dot pitch.
There are no visible scan lines.
That's how I remember VGA, too.

I used to have a 14" IBM PS/2 monitor. 1987-era technology. Very blurry/faceted-looking.
But it also masked the pixelation of QVGA-resolution games, making them bearable to the eyes. 😉

I remembered this when I watched this video from ~5 hours ago, btw.
Hence my comments here. 😅

https://www.youtube.com/watch?v=m79HxULt3O8

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 9 of 21, by mockingbird

User metadata
Rank Oldbie
Rank
Oldbie

I'm not familiar with all the little minutiae of technical issues that cause modern LCDs not to work well with vintage systems (and through DVI ports)... Maybe one day there will be an inexpensive solution for everyone out there. Until then, I just use a secondary 4:3 ancient 15" LCD from 2002, and that works just fine, even for 720x400, which never displays in 4:3 on my 27" BenQ or Acer LCDs despite having aspect ratio selected in the menu (and obviously I am always outputting VGA and never DVI).

mslrlv.png
(Decommissioned:)
7ivtic.png

Reply 10 of 21, by darry

User metadata
Rank l33t++
Rank
l33t++
Jo22 wrote on 2022-09-15, 20:44:
Early, lower end VGA monitors had a .52 dot pitch some times. There are no visible scan lines. That's how I VGA remembered, also […]
Show full quote

0.52mm dot pitch? Wow! That must have been something ugly to behold. On that note, I just saw this video pop into my recommendation list: https://www.youtube.com/watch?v=m79HxULt3O8 It is about the 0.52mm dot pitch Tandy VGM-225. I will have to watch that.

I remember seeing some IBM 8512 monitors (0.41mm dot pitch according to http://ps-2.kev009.com/pcpartnerinfo/ctstips/ee1a.htm ) in action and, TBH, while they weren't great, they did not seem that bad.

Reply 12 of 21, by SScorpio

User metadata
Rank Member
Rank
Member

There are inexpensive VGA->HDMI converters some people have luck with, but I haven't gotten them to work. I bit the bullet and got an OSSC, which works great.

However, the OSSC won't output 1440p. With a 240p signal you can get a 3x scale to 720p, but higher resolutions max out at 2x.

There is an OSSC Pro in the works, but it's unknown when it will release; it was announced in Jan 2020. It's expected to cost between $300 and $500 USD and will supposedly scale up to 4K.

I also have a GBS-Control I mess around with, but I haven't gotten that to work with my PCs either.

Reply 13 of 21, by Jo22

User metadata
Rank l33t++
Rank
l33t++
darry wrote on 2022-09-16, 02:51:

I remember seeing some IBM 8512 monitors (0.41mm dot pitch according to http://ps-2.kev009.com/pcpartnerinfo/ctstips/ee1a.htm ) in action and, TBH, while they weren't great, they did not seem that bad.

I'm not 100% sure, but that looks like the monitor I had.
There are no visible scan lines and the picture is slightly blurry.
Looking back, it was perhaps the reason 16-colour graphics didn't look so pixelated.
If dithering (say, a 16-colour VGA driver) was used, it wasn't so obvious.

https://www.youtube.com/watch?v=4KpSCNXpJDU

^I must admit that the picture quality of this model shown in the video is higher than what I remember.
I think my monitor had a cheaper CRT, maybe. Or it wasn't correctly adjusted anymore.

keenerb wrote on 2022-09-16, 03:31:

My first VGA monitor was .52 dot pitch, and it was pretty bad, but it was a step up from the CGA I was using...

Cool! And in colour, I guess? I can relate, I think.
Back in the 90s, when I got my 286, I had the choice between two little VGA monitors:
the PS/2 monitor (aka the ugly one) and a pretty-looking one which was monochrome.
In the end, I went with the PS/2 monitor. My father convinced me not to get the monochrome model.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 14 of 21, by keenmaster486

User metadata
Rank l33t
Rank
l33t

I purchased a Gefen VGA->DVI scaler. It works, but its scaling algorithm is trash (pixels both fuzzy AND misshapen, somehow) and won't output 1440p, only 1080p, so it can't produce sharp uniform-for-all-practical-purposes pixels from 320x200 anyway.

Back to the drawing board. Maybe I'll wait for the OSSC Pro, which supposedly has 1440p output.

World's foremost 486 enjoyer.

Reply 15 of 21, by darry

User metadata
Rank l33t++
Rank
l33t++
keenmaster486 wrote on 2022-09-19, 18:28:

I purchased a Gefen VGA->DVI scaler. It works, but its scaling algorithm is trash (pixels both fuzzy AND misshapen, somehow) and won't output 1440p, only 1080p, so it can't produce sharp uniform-for-all-practical-purposes pixels from 320x200 anyway.

Back to the drawing board. Maybe I wait for OSSC Pro to come out which supposedly has 1440p output.

If you want to realistically simulate what an OSSC (non-Pro) could do for your current setup, you could:

a) Take a 320x200 screen capture from a VGA game of your choice using DOSBox, for example. A video clip would work too.
b) Use nearest-neighbour "upscaling" (essentially line multiplying or integer scaling) to convert the above to 1280x800 (EDIT: using, for example, a paint program like GIMP, Photoshop, PaintShop Pro or maybe even MS Paint).
c) Use a modern PC running Windows or Linux to generate a custom resolution of 1280x800@70Hz and switch to it (IMPORTANT: make sure to disable any and all scaling on the GPU).
d) Display/play back the file generated in step b in full screen (set monitor aspect ratio to 4:3, if possible).

Doing the above would let you see:

1) How your monitor handles scaling from 1280x800 to 2560x1440 and how sharp (or not) it would look with an OSSC (non-Pro) if you had one.
2) Whether your monitor can actually display resolutions with non-square pixels in 4:3 aspect ratio. Many modern monitors cannot; they can only display in 4:3 those resolutions where (number of x pixels)/(number of y pixels) = 4/3.

EDIT: Corrected nonsense typos. I should really lay off the cabernet when posting. 😉
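Step b above can also be done in a few lines of code instead of a paint program; a pure-Python sketch of the 4x nearest-neighbour blow-up from 320x200 to 1280x800 (dummy pixel values, no image I/O):

```python
def nearest_neighbor_scale(frame, factor):
    """Integer upscale: replicate each pixel `factor` times in x and y."""
    out = []
    for row in frame:
        wide = [px for px in row for _ in range(factor)]  # stretch horizontally
        out.extend([wide] * factor)                       # repeat vertically
    return out

src = [[(x + y) % 256 for x in range(320)] for y in range(200)]  # dummy image
dst = nearest_neighbor_scale(src, 4)   # 320x200 -> 1280x800
print(len(dst), len(dst[0]))           # 800 1280
```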

Reply 16 of 21, by keenmaster486

User metadata
Rank l33t
Rank
l33t

Interesting. I will have to see if I can try that.

Edit: Okay, so I just set my resolution to 1280x800 to see what would happen, and it displays at its native 16:10 aspect ratio, i.e. square pixels. That's not going to cut it.

I need something that scales to 4:3.
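For reference, "scaling to 4:3" on a 16:9 panel just means using the largest 4:3 area and pillarboxing the rest; a quick sketch of what that works out to:

```python
def viewport_4_3(panel_w, panel_h):
    """Largest 4:3 area that fits the panel (pillarboxed on 16:9)."""
    w = min(panel_w, panel_h * 4 // 3)
    h = w * 3 // 4
    return w, h

print(viewport_4_3(2560, 1440))  # (1920, 1440): 640x480 fits it at exactly 3x
print(viewport_4_3(1920, 1080))  # (1440, 1080)
```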

World's foremost 486 enjoyer.

Reply 17 of 21, by Sphere478

User metadata
Rank l33t++
Rank
l33t++

Following.

I’d like to find a simple active cable that lets me use DOS at 1600x1200 or 1920x1080, VGA to HDMI.

Quality of image would be nice, but I'm not going to be super picky about it.

Sphere's PCB projects.
-
Sphere’s socket 5/7 cpu collection.
-
SUCCESSFUL K6-2+ to K6-3+ Full Cache Enable Mod
-
Tyan S1564S to S1564D single to dual processor conversion (also s1563 and s1562)

Reply 18 of 21, by darry

User metadata
Rank l33t++
Rank
l33t++
Sphere478 wrote on 2022-09-20, 15:58:

Following.

I’d like to find a simple active cable that lets me use dos to 1600x1200/1920x1080 vga to hdmi

Quality of image would be nice but not gonna be super picky about it.

If your monitor supports forcing arbitrary resolutions to 4:3, you wouldn't need the active cable to be able to do it (no idea if any of them can). Of course, if you don't mind having a stretched image (wrong aspect ratio) in your use case, you don't need to care about that.

The active cable would need to support 70Hz input as well.

IMHO, that's a tall order, but maybe something like this exists. There was a thread that mentioned some options, AFAICR. Again AFAICR, none of them had aspect ratio control. I will try to find it.

EDIT: This thread --> What are the best Vga to hdmi scalers or peripherals for MS-DOS games ?

Reply 19 of 21, by Sphere478

User metadata
Rank l33t++
Rank
l33t++

My 4K TV (my monitor) can display 800x600, 640x480, 1024x768, and 1280x1024 just fine over DVI-to-HDMI in ME and XP.

Sphere's PCB projects.
-
Sphere’s socket 5/7 cpu collection.
-
SUCCESSFUL K6-2+ to K6-3+ Full Cache Enable Mod
-
Tyan S1564S to S1564D single to dual processor conversion (also s1563 and s1562)