VOGONS


Reply 240 of 295, by darry

bestemor wrote on 2020-09-04, 22:39:

So I take it the Extron EDID emulator (DVI) is not recommended or significantly useful in any way ?
At least not if you get that '300'/'300 A' scaler ?

The general answer to that is yes: there is no need to get an EDID emulator if you are going to use the Extron scaler.

However, I strongly recommend using a VGA card that does not upscale over VGA if you are going to use any external scaler, including the Extron or OSSC, connected to said VGA card.

In other words, if your monitor's VGA input, when directly connected, shows the resolution received from your VGA card as 1920x1080, you do not want to use that card with an external scaler. A specially programmed EDID emulator MIGHT help here, but you are likely going to go through a world of hurt trying to get it to work and give decent picture quality.

My recommendation is the Extron scaler, your current monitor, and a card that does not upscale over VGA under DOS (the FX 5900 family is probably your best bet and should work fine for anything up to 2001-2002, software-wise) if you can live without >60Hz and want to keep costs low.

Reply 241 of 295, by darry


I managed to make a video capture of the OSSC in action in Doom (320x200@70Hz line-doubled to 640x400@70Hz by the video card and then line-doubled to 1280x800@70Hz by the OSSC). It was captured by a Cam Link 4K, whose ability to capture 70Hz and oddball resolutions I have just discovered.

See Re: VGA Capture Thread
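For anyone checking the numbers, the chain is just two successive 2x steps with the refresh rate left alone, which is also why the capture device has to accept 1280x800 at 70 Hz. A trivial sketch of that arithmetic (purely illustrative; not anything the OSSC or the card actually runs):

```python
# The doubling chain described above: the card line-doubles 320x200@70,
# then the OSSC Line2x doubles the result again. The refresh rate is untouched.
def double(mode):
    w, h, hz = mode
    return (2 * w, 2 * h, hz)

mode_13h = (320, 200, 70)      # what Doom outputs
card_out = double(mode_13h)    # (640, 400, 70) on the VGA connector
ossc_out = double(card_out)    # (1280, 800, 70) sent to the capture device

print(card_out, ossc_out)
```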

Reply 242 of 295, by cde


Thank you darry! I have received my OSSC and I'm pleased to say that together with my 144 Hz AOC 2590PX monitor everything works perfectly 😀 I've set up two profiles in the OSSC (the arithmetic behind the two sampling choices is sketched after the lists):

Profile 0 (used for 80x25 text mode):

  • Settings opt / Link prof->input / AV3: RGBHV
  • Sampling opt / 400p in sampler / VGA 720x400@70
  • Output opt / 480p-576p proc / Line2x

Profile 1 (used for 320x200 games):

  • Settings opt / Link prof->input / AV3: RGBHV
  • Sampling opt / 400p in sampler / VGA 640x400@70
  • Output opt / 480p-576p proc / Line2x
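For readers wondering why the same 70 Hz signal needs two different sampling presets: standard VGA text mode uses 9-dot-wide character cells (80 columns x 9 dots = 720 active pixels per line), while mode 13h games are pixel- and line-doubled by the card to 640x400. A minimal illustration of that arithmetic (my own sketch; the menu paths are the ones listed above):

```python
# Why the two profiles sample the same ~31.5 kHz / 70 Hz VGA signal differently.
# 80x25 text mode: 9-dot character cells, 25 rows of 16 scanlines each.
text_mode = (80 * 9, 25 * 16)     # (720, 400) -> "VGA 720x400@70" sampling preset

# Mode 13h (Doom etc.): the VGA card doubles 320x200 in both directions.
game_mode = (320 * 2, 200 * 2)    # (640, 400) -> "VGA 640x400@70" sampling preset

print(text_mode, game_mode)
```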

Below is a zoomed-in screenshot of DOOM with Profile 1 and the 17" 4:3 mode on the monitor. There is no input lag, and the game is completely smooth (Quake even more so).

Attachment: 20200905_133539-2.jpg (DOOM)

The AOC G2590PX, in addition to 17" and 19" 4:3 modes, also has an excellent 1:1 mode (the image is displayed as-is, centered). Together with Line2x above, it provides pixel-perfect doubling of 640x480 (and quadrupling of 320x240). Below are shown Jazz Jackrabbit and Windows 3.1:

Attachment: 20200905_131303-2.jpg
Attachment: 20200905_131859-2.jpg

With this setup all the problems I've encountered previously are solved: lack of VGA input on certain monitors, 320x200 games no longer have duplicated pixels (since they are sampled at 640x400, not the incorrect 720x400), games that rely on the vertical refresh interval for their internal logic run at the correct speed, and the aspect ratio is a correct 4:3. The only drawback that I see is that 80x25 text mode doesn't look very good when sampled at 640x400, but it takes just two presses on the remote to go back to 720x400 sampling.
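The duplicated-pixel issue comes down to integer versus non-integer sampling ratios. Here is a toy model (my own illustration, not OSSC code) that counts how wide each of the 320 game pixels ends up when a line is sampled at 640 versus 720:

```python
# 640 / 320 = 2 exactly, so every game pixel becomes 2 samples wide.
# 720 / 320 = 2.25, so some game pixels become 2 samples wide and others 3.
from collections import Counter

def pixel_widths(game_w, sample_w):
    owners = [x * game_w // sample_w for x in range(sample_w)]  # nearest-neighbour mapping
    return Counter(Counter(owners).values())

print(pixel_widths(320, 640))  # Counter({2: 320})        -> uniform doubling
print(pixel_widths(320, 720))  # Counter({2: 240, 3: 80}) -> uneven pixel widths
```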

Overall I would definitely recommend the OSSC despite it being expensive (about 100€); it is absolutely worth it. However, keep in mind that a monitor that accepts 1280x800@70Hz without dropping frames and provides a 4:3 option is required. darry has tested the Philips 252B9 with success, and I can confirm the AOC G2590PX also works great. Which one is better depends on the games played: I believe the 252B9 is probably best for 320x200, while the G2590PX is great for 640x480 with Line2x plus the 1:1 option, and is also great for modern games, being 144 Hz and FreeSync compatible.

Reply 243 of 295, by darry

cde wrote on 2020-09-05, 12:10:
Thank you darry! I have received my OSSC and I'm pleased to say that together with my 144 Hz AOC 2590PX monitor everything works […]

Thank you for sharing your experience, cde. Your AOC monitor seems very flexible (with 1:1 and multiple 4:3 modes) and a great choice for both retro and modern setups. Being able to get a no-compromises retro display setup using only modern, in-production monitors and an OSSC is a great thing. The only caveat I see is, as always, price: the combined cost of the OSSC AND an appropriate monitor is quite high indeed. Then again, this being a long-term solution, I would say it is worth it.

Reply 244 of 295, by cde


Indeed, it's my backup plan in case my CRT dies and I can't find this model 😀

I've also been toying with OSSC's scanline options. It's nice to have, although I haven't been able to replicate that CRT feel -- and it's probably not doable anyway without a much higher DPI, closer to this:

https://upload.wikimedia.org/wikipedia/common … _rgb_matrix.jpg

Reply 246 of 295, by darry


Depending on what it ends up costing, an OSSC Pro might end up changing the cost dynamic, if the need for a 4:3 mode can be eliminated .

The AOC 2590PX is 330 US$, the Philips 252B9 is about 223 US$, and the Acer VW257, or its DisplayPort-equipped variant, the BW257 (with an aspect mode but no 4:3 mode), is less than 200 US$. There are probably even cheaper decent 70Hz-capable displays without a 4:3 mode. Being able to use one of those cheaper monitors could make an OSSC Pro worthwhile if it expands monitor options and allows for a cheaper total cost, especially if it means being able to use a monitor that you already own.

That and the potential MiSTer compatibility that has been hinted at by OSSC Pro devs could really make this a fantastic product . I am still hyped, even if I do not absolutely need one right now . 😉

Reply 247 of 295, by darry

bestemor wrote on 2020-09-05, 17:26:

Ack... now I really REALLY want an OSSC !! 😥

Stop posting all this resolution pron guys !

Sorry about that . I guess we are just too hyped about this excellent product .

Hopefully, OSSC prices will go down even further once the Pro version launches .

Reply 249 of 295, by Horun

darry wrote on 2020-08-29, 03:47:

DVI-I normally includes DVI-D functionality plus analogue RGB (VGA) .

EDIT: If you are not getting a picture from a DVI-I equipped card, my guess is the card does not like the EDID for some reason . My DVI-I (not that being DVI-D would change anything here) equipped Geforce FX 5900 and FX 5900 XT display no picture on my Acer VW257 unless I feed them a modded simplified EDID . The same cards work fine at native resolution (1920x1200) on the Philips 252B9 . The Acer has no such issues with modern video cards .

Actually, thinking about it, the vBIOS in those Nvidia cards likely attempts to scale to whatever the native resolution is according to the first 128-byte EDID block. If the pixel clock for that is too high for DVI 1.0, I would imagine no picture would result, unless the vBIOS logic has a fallback mechanism (which I doubt). I will not be testing that because I just put my Acer into storage (never thought of dumping its EDID), and testing this would imply re-programming one of my EDID emulators, which would require that I use a monitor with programmable EDID as an intermediary, which is a pain.
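For reference, "the native resolution according to the first 128-byte EDID block" boils down to the preferred detailed timing descriptor, which starts at byte 54 of the base block. Below is a rough, purely illustrative sketch of reading it and checking it against the single-link DVI pixel-clock limit (a real vBIOS obviously does nothing like this in Python):

```python
# Pull the preferred timing out of a 128-byte base EDID block and compare its
# pixel clock against the ~165 MHz single-link DVI 1.0 limit.
def preferred_timing(edid: bytes):
    d = edid[54:72]                                      # first (preferred) detailed timing descriptor
    clock_mhz = int.from_bytes(d[0:2], "little") / 100   # stored in 10 kHz units
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active, clock_mhz

def fits_single_link_dvi(edid: bytes, limit_mhz: float = 165.0) -> bool:
    return preferred_timing(edid)[2] <= limit_mhz
```

If the preferred mode's pixel clock is above what a single link can carry and the card insists on driving that mode anyway, a blank screen is exactly what you would expect.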

Just a follow-up, and thanks again. That monitor has a DVI-DL input (according to its full spec sheet) and will not work with DVI-I or DVI-D single-link cables; it has to be a DVI-DL cable, even when connected to a DVI-I output. The manual states "DVI-D cable... to the monitor." I swear I tested with one short 3' DVI-D single-link cable and it worked, but not with any of the 6' cables I tried (and the 3' went back on my work machine, so I am not going to retest it).

Curious if there is any news on the release date of the OSSC Pro? Could not find anything other than speculation....

Last edited by Horun on 2020-09-05, 20:12. Edited 1 time in total.

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 250 of 295, by darry

Horun wrote on 2020-09-05, 20:05:

Just a follow up and thanks again. That monitor has a DVI-DL input (according to it's full spec sheet) and will not work with DVI-I only cables, it has to be a DVI-D cable even when connected to a DVI-I output. I missed that part in the manual where it states "DVI-D cable... to the monitor." I swear I tested with one short 3' DVI-I cable and it worked but not with any of the 6' tried (and the 3' went back on my work machine and not going to retest it).

Curious of any news on the release date of the OSCC Pro ? could not find anything other than speculation....

I was not aware that a DVI-I only cable was a thing. That would basically be a VGA cable with DVI-I connectors. I thought the only use of the "I" part of a DVI connector was for analogue VGA compatibility when using a passive DVI-I to HD15 adapter.

Reply 251 of 295, by Horun

darry wrote on 2020-09-05, 20:11:

I was not aware that a DVI-I only cable was a thing. That would basically be a VGA cable with DVI-I connectors. I thought the only use of the "I" part of a DVI connector was for analogue VGA compatibility when using a passive DVI-I to HD15 adapter.

Sorry, I meant it has to be a DVI-DL cable; I tried DVI-I and also a DVI-D single-link cable and they did not work.

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 252 of 295, by dr_st


This is weird. Maybe it means the monitor does not support reduced blanking and actually requires dual-link DVI to reach 1920x1200 resolution. Or it implements some really bizarre and non-standard handshake. There is no technical reason why a single-link DVI should not work, and in fact I think the standard mandates it.
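Putting rough numbers on the reduced-blanking point (commonly quoted CVT timings, so treat them as approximate): 1920x1200@60 with standard CVT blanking needs roughly a 193 MHz pixel clock, which is over the 165 MHz single-link limit, while CVT reduced blanking brings it down to about 154 MHz, which fits.

```python
# Approximate pixel clocks for 1920x1200@60 with standard vs. reduced blanking.
# Total (h_total x v_total) figures are the commonly quoted CVT / CVT-RB timings.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

cvt_standard = pixel_clock_mhz(2592, 1245, 60)  # ~193.6 MHz -> exceeds single-link DVI
cvt_reduced  = pixel_clock_mhz(2080, 1235, 60)  # ~154.1 MHz -> fits under 165 MHz

print(round(cvt_standard, 1), round(cvt_reduced, 1))
```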

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 253 of 295, by darry

Horun wrote on 2020-09-05, 20:14:

Sorry, I meant it has to be a DVI-DL cable; I tried DVI-I and also a DVI-D single-link cable and they did not work.

No problem. Upon further checking, DVI-I analogue-only cables may not be a thing, but DVI-I dual-link cables (analogue and dual-link digital combined) do seem to be a thing.

Upon some further memory jogging, I seem to recall there might have existed some CRT monitors with DVI-I connectors of which only the analogue part was functional. Maybe somebody can either confirm that or confirm that I might need some meds . 😉

Reply 254 of 295, by Horun

darry wrote on 2020-09-05, 20:23:

No problem. Upon further checking, DVI-I analogue-only cables may not be a thing, but DVI-I dual-link cables (analogue and dual-link digital combined) do seem to be a thing.

Upon some further memory jogging, I seem to recall there might have existed some CRT monitors with DVI-I connectors of which only the analogue part was functional. Maybe somebody can either confirm that or confirm that I might need some meds . 😉

I do not have any DVI-I dual-link cables, nor would I test with one. I am just going from the manual/specs, the cables I have, and this pinout of the different DVI connectors: https://displaygeeks.com/wp-content/uploads/2 … es-1024x453.jpg?

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 255 of 295, by darry

Horun wrote on 2020-09-05, 20:35:

I do not have any DVI-I dual-link cables, nor would I test with one. I am just going from the manual/specs, the cables I have, and this pinout of the different DVI connectors: https://displaygeeks.com/wp-content/uploads/2 … es-1024x453.jpg?

Are you using your BenQ BL2711U? I did not see any reference to requiring a DVI dual-link cable, but it is obvious that without dual-link capability, getting that monitor's native resolution out of the video card would be impossible.

Now that I think about it, that's probably the problem! The vBIOS on Nvidia cards tries to output the monitor's native resolution over DVI/HDMI, so if native resolution requires dual-link DVI, the video card must support dual-link AND the cable must be DVI dual-link. The only way around this on a single-link card, or without a dual-link cable, would be to force the card to output a resolution that the link is capable of carrying, by using an EDID emulator. A quick way to test this would be to boot the PC into DOS using another DVI monitor it does work with and then switch the cable to the BenQ. If that works, the hypothesis is confirmed (the Nvidia vBIOS on the cards I have only parses EDID at boot, not on a reconnect event).

If the card is HDMI capable and supports the monitor's native resolution over HDMI, using an HDMI cable would work .

Reply 256 of 295, by Horun

darry wrote on 2020-09-05, 20:57:

Are you using your BenQ BL2711U? I did not see any reference to requiring a DVI dual-link cable, but it is obvious that without dual-link capability, getting that monitor's native resolution out of the video card would be impossible.

Now that I think about it, that's probably the problem! The vBIOS on Nvidia cards tries to output the monitor's native resolution over DVI/HDMI, so if native resolution requires dual-link DVI, the video card must support dual-link AND the cable must be DVI dual-link. The only way around this on a single-link card, or without a dual-link cable, would be to force the card to output a resolution that the link is capable of carrying, by using an EDID emulator. A quick way to test this would be to boot the PC into DOS using another DVI monitor it does work with and then switch the cable to the BenQ. If that works, the hypothesis is confirmed (the Nvidia vBIOS on the cards I have only parses EDID at boot, not on a reconnect event).

If the card is HDMI capable and supports the monitor's native resolution over HDMI, using an HDMI cable would work .

Yes, the BenQ. Yes, the HDMI works just fine. The resolution has been locked at 1080p mode out of the video card. And I think you are correct that it is something about the Nvidia vBIOS (on that GTX 960 and 7100GS). I hooked up the Asus VS 228H-P monitor using a DVI-D single-link cable to the DVI-I port on the 960 card, then removed the cable from the Asus, hooked it to the BenQ, and it lit up. Booting the computer with the same DVI-D SL cable will not wake up the monitor, but with a DVI-DL cable to the card's DVI-I port it will. I did not boot to DOS using that cable, but am fairly sure it would not wake the monitor if booting into Win7 does not.
Am going to use the HDMI cable, but I was experimenting to see why I got no signal off a 7100GS DVI output in another computer that I booted up using the same DVI-D SL cable.

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 257 of 295, by darry


Added some more comparisons of Voodoo 3 and FX 5900 VGA through the OSSC vs. various Nvidia DVI cards doing their own upscaling to this thread: Re: I tested several pci video cards for picture quality in dos games.

Reply 258 of 295, by cde


I found out an interesting behaviour of the GeForce GTX 960: when the VGA output is directly connected to the monitor, it outputs 1920x1080@60 which is not helpful.

However, when connected to the OSSC instead, it produces a correct picture without doubled pixels when 320x200 (such as DOOM) is sampled at 720x400 instead of 640x400. So instead of doubling the lines, the GTX 960 is doing the more complex work of resampling each 320-pixel line to 720. Then, if I unplug the OSSC and plug the VGA cable into the monitor (without rebooting) in 1:1 mode, the image shown is indeed correct (this AOC G2590PX monitor assumes 720x400). However, the image produced by other cards with the standard 320x200 doubling to 640x400 is still quite superior IMO when using the OSSC.
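A quick way to see why sampling that 720-wide signal at 640 thins single-pixel strokes (like the L in BULL in the 640x400-sampled image below) is a toy nearest-neighbour model, not what the OSSC actually does internally: 640 samples can only land on 640 of the 720 source columns, so roughly one column in nine is skipped outright.

```python
# Sampling a 720-pixel-wide line with only 640 samples skips source columns.
src_w, sample_w = 720, 640
kept = {x * src_w // sample_w for x in range(sample_w)}  # source columns that actually get sampled
print(len(kept), "of", src_w, "columns kept;", src_w - len(kept), "dropped")
# -> 640 of 720 columns kept; 80 dropped, so 1-pixel-wide details can thin out or vanish
```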

Image with OSSC sampling at 720x400:

Attachment: 20200906_124520.jpg

Image with OSSC sampling at 640x400. Note how the L in BULL is too thin:

Attachment: 20200906_124537.jpg

Image without OSSC, VGA directly connected to the monitor in 1:1 mode (no scaling):

Attachment: 20200906_124826.jpg

Reply 259 of 295, by darry

cde wrote on 2020-09-06, 11:02:
I found out an interesting behaviour of the GeForce GTX 960: when the VGA output is directly connected to the monitor, it outpu […]

The GTX 960 upscaling to monitor native resolution (according to EDID) over VGA in DOS is a "relatively new" behaviour of Nvidia cards. My GTX 750 Ti does it. My GeForce 6200 and FX 5900 do not. The reason it does not happen when connected to the OSSC is probably that the OSSC presents no EDID on its VGA input, so the card defaults to passthrough. As with DVI, over VGA, Nvidia cards whose vBIOS upsamples to monitor native resolution only seem to check the EDID at boot, so if you disconnect a monitor or the OSSC after booting, the card will keep on acting as if it were still connected to whatever it booted with.

I am pretty certain that you have inverted your 720x400 and 640x400 sampling pictures. 640x400 sampling should be the one that displays correctly (proper letter width in the Doom status bar) when the VGA card is in 320x200.

EDIT: I am really not sure why there are no sampling artifacts in your 1:1 VGA screen photo.
EDIT2: Now I understand .

Last edited by darry on 2020-09-08, 03:25. Edited 2 times in total.