VOGONS


Reply 220 of 295, by darry

User metadata
Rank l33t++
bestemor wrote on 2020-08-30, 23:18:
darry wrote on 2020-08-28, 01:56:
bestemor wrote on 2020-08-28, 01:13:

I mean, I play Doom (on DOS 6.22) on an older card, and I get 720x400 on that monitor.
But doing the same with a 6800GT, it suddenly shows as 1920x1080 instead. Although my memory might be a little bit fuzzy on the actual details, I DO remember it not showing the correct resolution anymore...

And NOTHING else was changed in hardware or software other than that card. (on a Slot 1 mobo)
Hence the monitor is fine, and I blame the card, which for some strange reason is suddenly choosing to force the native (monitor) resolution only.
So perhaps a VGA EDID emulator would fix that (?).

To clarify, I am mostly talking about over VGA. Because of said problems with that 6800GT.
But if an EDID emulator fixes this (AND the proper resolution!), both on VGA and HDMI, then great... 😁
(any suggestions for an idiot-proof, easy-to-use, reasonably priced, and actually obtainable model? VGA or HDMI or DVI)

- Sadly I really know very little about any of these things, programming EDID or anything else 'fancy', but I still want my monitor(s) to show 720x400@70Hz somehow...
Basically I am trying to 'future proof' for the time when my VGA monitors die, as well as fixing some current issues.

What could a combo of an Extron 300 DVI + VGA EDID emulator do, I wonder?
(VGA card -> emulator -> Extron -> monitor, or do we need a DVI/HDMI EDID emulator after the Extron instead?)

So the 6800GT does display a picture over VGA when running Doom, but the monitor says it's getting 1920x1080, correct? If that is the case, I would think that the video card is upscaling 720x400 (actually 640x400, but the monitor interprets it as 720x400) to 1920x1080 over VGA, which surprises me. Are you sure you weren't running over DVI? That would be the expected behaviour in that case. I have never seen an Nvidia card upscale over VGA, but then again, I have not used Nvidia VGA outputs in a very long time.

As mentioned before, the Extron 300 family will force everything to 60Hz and there is no way around that. If that is not an issue for you, it is one of the best, cheapest and simplest scaling solutions available. When using an Extron 300 family scaler/digitizer, VGA and DVI/HDMI EDID emulators are useless: the Extron lets you override the DVI/HDMI EDID on its output, and I do not see why you would need an EDID emulator on its VGA input.

To make it easier to understand what you need/want, allow me to ask you some questions.

a) What video card(s) do you intend to use and what output(s) do(es) it/they have?
b) Do you already have an LCD monitor lined up for use in a retro setup (which model?) or are you planning on purchasing one (now or when your CRT(s) die(s))?
c) Is maintaining 70Hz in DOS games important to you, or would conversion to 60Hz be acceptable?
d) Does the uneven scaling/digitization of 640x400 (line-doubled 320x200) as 720x400 by essentially all LCD monitors over VGA input bother you?
e) Does having 4:3 resolutions stretched to 16:9 or 16:10 on a widescreen monitor bother you or not?
f) When running over DVI/HDMI, Nvidia cards typically upscale to a monitor's native resolution, but the result is rather soft-looking. Does this bother you?

I know this is a lot of questions, but it will help a lot if these points can be cleared up first. I went through a lot of rather expensive trial and error and after-the-fact realizations about what I wanted, and I would like to help you avoid that. If some of these points seem unclear or read like gibberish, do not hesitate to ask for clarification.

Sorry for the late reply, but I've been trying VERY hard to find the hardware I used in that 'test' I believe I posted about somewhere a long time ago...
Oh well... Either my memory is not what it used to be, or I just cannot find the card, and I have hence not been able to replicate it so far.
Meaning, ALL my 6800GT(X) cards (that I could find) are AGP 2.0, which means NONE of them fit into my AGP port at ALL when I tried just now.... 🤔

So, was it an ATI card I used? Or which card was it, I wonder?

As for the questions:
(so far all I've used are monitors with VGA input; I have not tried DVI-D or anything else for this)

a) Any card with VGA output, I suppose, from the S3 968 and up. I have a selection from all eras, but haven't really landed on any specific card(s) yet.
I suppose I want to keep this open when looking for a suitable monitor. But strictly speaking, I guess the line is drawn around when cards started to have BOTH VGA and DVI connectors en masse. Even then, having the option of 75Hz refresh rates (over analogue) is something I'd want to preserve as much as possible for various reasons, on those monitors that allow for it...
- So perhaps I should settle for DVI? For my GeForce4 cards and up? I don't know...

b) I figured the EIZO FS2333 could do the job: modern-ish (durability), reasonably good scaling and picture quality, and it has native VGA-in.
I DO have some other models as well, older, with VGA-in, but I am not sure how long they'd last.
BUT, I am still/always looking for a 'better'/more versatile or retro-friendly modern (longevity/backup) model, so... open to suggestions (!).

c) As for 70Hz, well... I would like* to have an option of higher than 60Hz available.
And there are times when the programs themselves FORCE 70-72Hz, and I end up with a black screen if the current monitor does not cope well with that (I had one old S3 card send out 71Hz, and the modern monitor I was testing at the moment tanked/could only tolerate 70Hz max).
(*: as I DO notice a real difference with 75Hz with certain hardware combinations in Windows)

d) Well, I've tried testing things out using Doom, since it supposedly is a 320x200 mode game, though I can't recall ever seeing that resolution reported anywhere, even in its 640x400 incarnation. Granted, I have not done any testing on a CRT that I can recall, but I have a suspicion that the graphics card would not put out that resolution, regardless of monitor... (?) I'm not that into these things though, so maybe if I had the 'right' (old) VGA card it would? Please correct me on any of the above...

e) Now, THIS bothers me, yes. And the only monitor I currently own with a forced 4:3 setting is a Dell U2412M (1920x1200 native, but it does not support 75Hz at all, at any resolution).
The EIZO just has 'Enlarged', which keeps the original aspect ratio, but it has some settings that allow for 75Hz, which I like.

f) Soft/fuzzy/smudgy upscaling I do not like. At all. I always check out the 'aspect ratio' testing when reading monitor reviews, to see how bad non-native resolutions look. Some monitors are better than others, or so it seems? But are you now telling me it is really/mostly the card's doing?

As for an EDID emulator, I figured it could be used to help with point c) above, when the monitor would otherwise have problems with the refresh rates or resolutions coming out of my old PC/card etc... (?) While still employing regular VGA cable(s) 'directly', as in no signal conversion to HDMI or whatever.
If that makes any sense?

Thank you for your answers.

To address some of the points that may not be quite clear:

a) Long term, VGA input is likely going away, so you will likely need something like an OSSC or OSSC Pro to handle conversion to DVI/HDMI.

b) The EIZO FS2333 seems like a nice monitor, but the lack of an explicit 4:3 mode worries me. More on that in point e).

c) You will need to test whether the EIZO FS2333 skips frames at >60Hz, whether it handles the resolutions and refresh rates you want and, if you are considering an OSSC, whether it handles 1280x800@70Hz and 1440x400@70Hz (640x400@70Hz and 720x400@70Hz line-doubled by the OSSC). All of these can be tested from a modern PC by generating custom resolutions with the Nvidia driver.

d) Over analogue VGA output, 320x200 (like in Doom) is always line-doubled to 640x400, but most if not all LCD monitors will detect it improperly as 720x400 and digitize it with the wrong sampling rate, which produces artifacts (which may or may not be very noticeable to you); see the sketch after point f) for the arithmetic. A CRT monitor may "detect" 640x400 as 720x400, but it will display it properly because no digitization takes place, so the picture is shown correctly regardless. Over DVI/HDMI in DOS, on Nvidia cards, 640x400 will be upscaled to the monitor's native resolution, and stretched horizontally if the monitor is widescreen.

e) You will have to tell me if it actually works that way, but I doubt the "enlarged" mode on the EIZO FS2333 properly displays 640x400 (line-doubled 320x200) under DOS as 4:3. I would expect this mode to properly display square-pixel 4:3 modes like 640x480, 800x600, 1024x768, etc. as 4:3, however.

f) Over DVI/HDMI in DOS on an Nvidia card, it's the video card doing the upscaling to the monitor's native resolution, not the monitor. On cards at least up to the GeForce FX series, this gives a rather soft picture when upscaling 640x400 (line-doubled 320x200).
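To make the sampling problem in point d) concrete, here is a rough sketch (Python, assuming the textbook VGA timings; the exact artifact pattern also depends on the monitor's clock/phase settings) of what happens when a 640-pixel line is sampled as if it were a 720-pixel one:

SRC_ACTIVE, SRC_TOTAL = 640, 800   # what the card actually sends (640x400 timing)
MON_ACTIVE, MON_TOTAL = 720, 900   # what the monitor assumes (720x400 timing)

# Source pixel that each of the monitor's 720 samples lands on.
mapping = [j * SRC_TOTAL // MON_TOTAL for j in range(MON_ACTIVE)]

doubled = sum(1 for a, b in zip(mapping, mapping[1:]) if a == b)
print(f"{doubled} of {SRC_ACTIVE} source columns get sampled twice per line")

That prints 80, i.e. roughly every eighth column comes out double width, which is the uneven vertical banding typical of 640x400 misdetected as 720x400.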

As for signal conversion, LCD monitors are inherently digital devices, even those equipped with only a VGA input. The only thing you "gain" when feeding an LCD monitor an analogue VGA signal is the ability to force the monitor to do the upscaling to native resolution. No VGA card that I know of (I would welcome being proven wrong) upscales its VGA output, so what comes out of that analogue output under DOS is 640x400@70Hz (line-doubled 320x200) or 720x400@70Hz (text mode) in standard VGA modes. If using SVGA modes under DOS, those will also come out "as is". Though sending DVI/HDMI to an LCD monitor implies less processing and less image degradation from successive AD and DA conversions, VGA may be more practical and flexible, especially considering point f).
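Incidentally, those two modes are indistinguishable to the monitor, which is why the misdetection in point d) happens at all: over VGA, all the monitor can measure is sync frequencies, and the standard 640x400 and 720x400 timings produce identical ones. A quick check with the textbook IBM VGA numbers (a sketch, nothing monitor-specific):

modes = {
    # name: (pixel clock in Hz, horizontal total, vertical total)
    "640x400 graphics": (25_175_000, 800, 449),
    "720x400 text":     (28_322_000, 900, 449),
}

for name, (clock, htotal, vtotal) in modes.items():
    hfreq = clock / htotal    # line rate; the monitor can measure this
    vfreq = hfreq / vtotal    # refresh rate; measurable too
    print(f"{name}: {hfreq / 1e3:.3f} kHz / {vfreq:.3f} Hz")

Both lines print ~31.469 kHz / ~70.087 Hz. The only real difference is the pixel clock, which the monitor cannot measure, so monitor firmware guesses the far more common text mode and samples 900 times per line instead of 800.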

So, if I understand correctly, you want to be able to handle VGA input, possibly DVI/HDMI input and >60Hz, maintain a 4:3 aspect ratio, and you like your upscaling sharp and crisp. You also already have an EIZO FS2333 monitor that you would like to use.

Assuming you can get proper 70Hz and higher at 4:3 (when appropriate) out of the EIZO FS2333 (points c and e), I would consider getting an OSSC for cards with VGA output. IMHO, the OSSC does such a fine job of digitizing even higher resolutions like 1600x1200 received over VGA that using DVI/HDMI is not really all that necessary. Though nothing is preventing you from running one card for high resolutions in Windows (and occasionally SVGA in DOS) over DVI/HDMI and another for VGA output through an OSSC.

If you do not want to spend on an OSSC, the only things you will lose are the extra sharpness of line-doubling by the OSSC (rather than scaling directly from, say, 640x400@70Hz to monitor native) and the ability to correct for the issue in point d). This is assuming the monitor you are using properly handles >60Hz over VGA.

If your EIZO FS2333 does not display DOS VGA resolutions as 4:3 (point e), I suggest you either consider a different monitor (because an OSSC will not be able to fix that), wait for the OSSC Pro, or give up on >60Hz and use an Extron DVI 300 or similar to upscale to a 4:3 square-pixel resolution that most monitors will display as 4:3.

Sorry if I missed something along the way. As always, I invite everyone interested to correct or challenge me as necessary.

Reply 221 of 295, by darry

User metadata
Rank l33t++
dr_st wrote on 2020-09-02, 05:30:
darry wrote on 2020-09-02, 04:35:

The only mid-2000s monitors I have left are a Samsung 204B and a Dell U2412M, both of which are 4:3 and have recently been put back in storage.

I suppose you are referring to your 2007FP, since the U2412M is neither 4:3 nor from the mid-2000s.

Indeed, 2007FP. Sorry about the brain fart.

Reply 222 of 295, by darry

User metadata
Rank l33t++
darry wrote on 2020-08-26, 05:08:

I dug up the XP machine and hooked it up to my Acer EB321HQ over HDMI to run some baseline tests first, in the hope of doing some VGA testing on the Philips 252B9.

Unfortunately, vsynctester.com does not seem to work well with the last XP-compatible versions of Firefox or Chrome (I had to load it through archive.org to bypass the SSL issue Chrome was having with the site). Using either of those browsers, the graph indicates frame drops even before getting to the monitor, and that's at 1920x1080@60Hz. The machine is a third or fourth generation Core i3 (2 cores) with a GeForce GTX 750 Ti, so I doubt it's the CPU or GPU's fault, though for some reason I did not think of checking CPU load.

If it's not a CPU load issue, I might do a Windows 10 test install on a spare drive to have a usable testbed. Anyway, it's getting late here, so I will likely test more tomorrow.

Sorry that this is taking forever. I finally installed Windows 10 on the machine and vsynctester.com works fine at least up to 75Hz at 1920x1080 on my Acer EB321HQ over HDMI (I was wrong about the CPU: it is an i3-2100 with a 65W TDP, which will get "upgraded" to an i3-3220T with a 35W TDP). I had PSU issues (now fixed), XP/Windows 10 co-existence issues (also fixed) and plenty of other issues (LS-120 compatibility, JMB363 forced IDE mode, etc.) that would probably have deserved another thread, but I digress.

I should be moving on to VGA tests soon. Before I do, though, I did test some DOS stuff over DVI/HDMI on the Philips 252B9:

a) DOS upscaling to 1600x1200@70Hz with reduced blanking, using an EDID emulator on a GeForce GTX 750 Ti with the Philips 252B9, DOES NOT WORK (no picture) (works fine on the FX 5900)
b) Nvidia scaling of Doom from 640x400@70Hz (line-doubled 320x200) over HDMI/DVI to 1920x1200@60Hz, whether forced to 4:3 by the monitor or not, seems a lot sharper on the GTX 750 Ti than on the FX 5900.

EDIT (the DVI and HDMI outputs on my GeForce GTX 750 Ti have completely different behaviour!!!):
I did test some DOS stuff over DVI and HDMI on the Philips 252B9:

a) DOS upscaling to 1600x1200@70Hz with reduced blanking, using an EDID emulator on the GeForce GTX 750 Ti HDMI output with the Philips 252B9, DOES NOT WORK (no picture) (works fine on the FX 5900)
b) DOS upscaling to 1600x1200@70Hz with reduced blanking, using an EDID emulator on the GeForce GTX 750 Ti DVI output with the Philips 252B9, WORKS FINE
c) Nvidia scaling of Doom from 640x400@70Hz (line-doubled 320x200) over HDMI or DVI, scaled to 1920x1200@60Hz (HDMI) or 1600x1200@70Hz (DVI with forced EDID), seems a lot sharper on the GTX 750 Ti than on the FX 5900.

Reply 223 of 295, by darry

User metadata
Rank l33t++

So I finally ran some tests on the Philips 252B9 in VGA mode and I learned a few things along the way.
I used a GeForce GTX 750 Ti (through a passive DVI-I to VGA adapter) and my trusty FX 5900 (VGA port).

Observations:

a) Over VGA, the GTX 750 Ti DOES SCALE to native resolution (1920x1200) in DOS!!! I was flabbergasted (when did Nvidia start doing that?).
b) The FX 5900 DOES NOT scale to native resolution over VGA. Under DOS, 640x400@70Hz (line-doubled 320x200) and 720x400@70Hz are both detected as 720x400@70Hz. This was expected.
c) Over VGA, the GTX 750 Ti in Windows 10 does allow 1600x1200@70Hz without frame-skipping on the Philips 252B9, according to vsynctester.com.
d) Over VGA, the GTX 750 Ti in Windows 10 could not be convinced to output 1600x1200@60Hz without scaling to 1920x1200@60Hz, no matter what scaling option I chose. I believe this is a driver bug (current Windows Update driver, did not check the version).
e) Over VGA, the GTX 750 Ti in Windows 10 displayed 1600x1200@70Hz (custom resolution) without scaling (the monitor saw 1600x1200@70Hz).
f) Considering point b), you have likely already guessed that, when using the VGA input on the Philips 252B9, 640x400@70Hz (line-doubled 320x200) is improperly sampled as 720x400@70Hz, as is common to all LCD monitors using VGA input.
g) As mentioned in a previous post, the GTX 750 Ti's upscaling of 640x400@70Hz (line-doubled 320x200) to monitor native resolution (1920x1200) is sharper than the FX 5900's (DVI only) upscaling. This is true in VGA mode as well on the GTX 750 Ti. Using an OSSC in line2x mode gives sharper results than either the FX 5900 or the GTX 750 Ti. In all cases, using the 4:3 mode preserved the proper aspect ratio in DOS.

Additional notes: Quake crashes at start under DOS, and Duke Nukem 3D is flickering and/or slow in VESA modes on a GTX 750 Ti, whatever output is used.

Conclusion:
1) If you don't mind point f), using the VGA input for retro purposes on the Philips 252B9 is an option (70Hz and proper aspect ratio in DOS modes are preserved).
2) If you do mind point f) and/or want even sharper scaling in DOS modes, get an OSSC along with the Philips 252B9.
3) Using a GTX 750 Ti for retro purposes is not recommended. (Duh!)

Reply 225 of 295, by darry

User metadata
Rank l33t++
cde wrote on 2020-09-03, 10:04:

Excellent darry! Thanks for the testing. How does the monitor handle the doubled 640x400 output from OSSC? Do you get pixel-perfect scaling?

Well, it is pixel-perfect up to 1280x800 thanks to the OSSC; the monitor then scales that to 1600x1200. IMHO, that gives a decently sharp result where the original 320x200 pixels are distinguishable. At the same time, the final bit of scaling from 1280x800 to 1600x1200 softens things up just a little, which is not unpleasant, IMHO. I will post some pictures of the Doom HUD to give an idea.
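For the curious, the scale factors stack up as follows (a trivial sketch; "pixel-perfect" simply means every step before the last one is an integer ratio):

steps = [(320, 640,  "VGA hardware doubling (mode 13h)"),
         (640, 1280, "OSSC line2x"),
         (1280, 1600, "monitor scaling in 4:3 mode")]

for src, dst, stage in steps:
    kind = "integer" if dst % src == 0 else "fractional"
    print(f"{stage}: {dst / src:g}x ({kind})")

That gives 2x, 2x, then 1.25x: only the last step has to interpolate, which is where the slight softening comes from.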

Reply 226 of 295, by bestemor

User metadata
Rank Oldbie
darry wrote on 2020-09-03, 05:25:

...
f) Considering point b), you have likely already guessed that, when using the VGA input on the Philips 252B9, 640x400@70Hz (line-doubled 320x200) is improperly sampled as 720x400@70Hz, as is common to all LCD monitors using VGA input.
...

So, I now assume a (VGA) EDID emulator would have NO positive effect on this situation/end resolution?
(as it seems to be the monitor itself that is the culprit when using the VGA input?)

But it would work on the digital inputs, using a DVI/HDMI EDID emulator I mean, albeit then only getting a 60Hz refresh instead of 70Hz?

Barring any post-adjustments in Windows/graphics card driver settings, using DOS only.
(but in, say, Win98, PowerStrip might alleviate some/all discrepancies?)

...sorry if these are stupid questions, but man, is this complicated... 😂

PS: you should perhaps try an ATI card or two as well (AGP), as these seem to behave rather differently from Nvidia...
The older PCI ones (ATI) might be 'better' in comparison, I believe. But that is just my theory, as I have no such cards myself.

Reply 227 of 295, by darry

User metadata
Rank l33t++
bestemor wrote on 2020-09-03, 21:03:
darry wrote on 2020-09-03, 05:25:

...
f) Considering point b), you have likely already guessed that, when using the VGA input on the Philips 252B9, 640x400@70Hz (line-doubled 320x200) is improperly sampled as 720x400@70Hz, as is common to all LCD monitors using VGA input.
...

So, I now assume a (VGA) EDID emulator would have NO positive effect on this situation/end resolution?
(as it seems to be the monitor itself that is the culprit when using the VGA input?)

But it would work on the digital inputs, using a DVI/HDMI EDID emulator I mean, albeit then only getting a 60Hz refresh instead of 70Hz?

Barring any post-adjustments in Windows/graphics card driver settings, using DOS only.
(but in, say, Win98, PowerStrip might alleviate some/all discrepancies?)

...sorry if these are stupid questions, but man, is this complicated... 😂

PS: you should perhaps try an ATI card or two as well (AGP), as these seem to behave rather differently from Nvidia...
The older PCI ones (ATI) might be 'better' in comparison, I believe. But that is just my theory, as I have no such cards myself.

A VGA EDID emulator will allow you to force a given resolution out of a video card ONLY IF the VGA card's BIOS upscales to the native resolution in the EDID. AFAIK, older video cards do not upscale over VGA in DOS (the FX 5900 and Radeon 9700, for example); the GTX 750 Ti does. Maybe your GeForce 6800 does the same (that would explain the behaviour you described). The monitor itself does not decide to scale: the monitor only scales when the video card provides it with a resolution it cannot display directly. If the video card does not scale over VGA and sends 640x400 directly to the monitor, then and only then will the monitor misidentify 640x400 as 720x400.

Over DVI, even most older Nvidia and ATI cards will scale to either whatever is the native resolution or 1280x1024, for some reason. Using an EDID emulator on the DVI output will allow you to customize the resolution and refresh rate that the video card sends to the monitor under DOS, within the limits of what the video card's BIOS will allow. Since the video card is doing the processing, the 640x400-seen-as-720x400 problem does not occur (at least on Nvidia cards), as the card "knows" it is starting out with a 640x400 image in the digital domain.
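In case anyone wants to go down the programmable-EDID route: the part of an EDID that forces a mode is an 18-byte detailed timing descriptor, and packing one is simple enough to sketch. Rough Python below; the 1600x1200@70Hz reduced-blanking numbers are illustrative guesses, not taken from any real monitor:

def dtd(pclk_khz, ha, hbl, va, vbl, hso, hsw, vso, vsw,
        hsize_mm=0, vsize_mm=0, flags=0x1E):
    """Pack one EDID 1.3 detailed timing descriptor (18 bytes)."""
    d = bytearray(18)
    pc = pclk_khz // 10                  # pixel clock in 10 kHz units
    d[0], d[1] = pc & 0xFF, pc >> 8      # little endian
    d[2], d[3] = ha & 0xFF, hbl & 0xFF   # horizontal active/blanking, low bits
    d[4] = ((ha >> 8) << 4) | (hbl >> 8)
    d[5], d[6] = va & 0xFF, vbl & 0xFF   # vertical active/blanking, low bits
    d[7] = ((va >> 8) << 4) | (vbl >> 8)
    d[8], d[9] = hso & 0xFF, hsw & 0xFF  # hsync offset (front porch) and width
    d[10] = ((vso & 0xF) << 4) | (vsw & 0xF)
    d[11] = (((hso >> 8) & 3) << 6) | (((hsw >> 8) & 3) << 4) \
          | (((vso >> 4) & 3) << 2) | ((vsw >> 4) & 3)
    d[12], d[13] = hsize_mm & 0xFF, vsize_mm & 0xFF
    d[14] = ((hsize_mm >> 8) << 4) | (vsize_mm >> 8)
    d[17] = flags                        # 0x1E = digital separate sync, +h/+v
    return bytes(d)

# 1600x1200@70 with reduced blanking, roughly: 1760 total x 1235 total
# at 70 Hz needs a ~152.2 MHz pixel clock (1760 * 1235 * 70).
print(dtd(152152, 1600, 160, 1200, 35, 48, 32, 3, 4).hex())

This only builds the 18-byte descriptor; it would still have to be spliced into a full 128-byte EDID block with a valid checksum (a tool like edid-decode can verify the result).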

The only ATI card I have extensively tested is the Radeon 9700. For some reason, it insisted on outputting 1280x1024@75Hz over DVI under DOS (regardless of EDID). Thanks to cde's debugging of the ATI vBIOS and lots of trial and error on my part, I was able to patch my Radeon 9700's vBIOS to get 1024x768@70Hz out of it. I quickly gave up on the Radeon 9700, though, as both it and the 9200 I tried showed serious (IMHO) DOS VGA incompatibility by crashing when running Second Reality. I doubt later ATI/AMD cards would have improved in DOS compatibility, so I do not recommend any Radeon 9xxx series card for DOS use. The 8500/9100 and 9200 probably have similar VGA cores, so I suspect they have similar VGA compatibility issues. The Radeon 9700 and 9200 did not upscale over VGA, and neither did the Radeon 8500 I used to own. Finally, to close the Radeon chapter, I did test some Radeon 9600/9550 cards at some point, only over DVI; their vBIOS behaviour, AFAICR, varied depending on the card: some sent 1280x1024@75Hz under DOS and at least one of them sent 1600x1200@60Hz. That was with a Samsung 204B, which has a native resolution of 1600x1200@60Hz in its EDID.

PowerStrip will work well for anything under Windows 9x, AFAIK. DOS VGA applications running under Windows 9x will be unaffected, unless, possibly, the video card is doing the upscaling. I could be wrong on that last point, as I vaguely remember someone managing to control DOS refresh rates on a DVI-connected Nvidia card.

I hope this helps clear up some things. It's ironic that what we are all chasing is basically so simple, a nice, sharp, 4:3 picture at the refresh rate we want, but there are so many variables layering up to entangle things that it becomes quite complicated indeed. This is definitely a case of "the devil is in the details" if I have ever seen one.

Please keep the questions coming as long as necessary and I will do my best to answer them.

Cheers!

Reply 228 of 295, by darry

User metadata
Rank l33t++

FX5900 upscaling to 1600x1200 over DVI.

fx5900_1600x1200.jpg (1.1 MiB, public domain)

Note: this is not mis-focused. It actually is that soft.

Voodoo 3 sending 640x400 over VGA to the OSSC, which line-doubles it to 1280x800 and outputs over HDMI; the monitor then upscales to 1600x1200.

voodoo3_ossc_1280x800.jpg (1.21 MiB, public domain)

EDIT: Monitor is a Philips 252B9 in 4:3 mode.
EDIT2: Exposure was manually set at 1/15 second at ISO 50 using a Samsung Galaxy S7.

Reply 229 of 295, by Oetker

User metadata
Rank Oldbie
darry wrote on 2020-09-04, 05:14:

FX5900 upscaling to 1600x1200 over DVI.
fx5900_1600x1200.jpg
Note: this is not mis-focused. It actually is that soft.

Voodoo 3 sending 640x400 over VGA to the OSSC, which line-doubles it to 1280x800 and outputs over HDMI; the monitor then upscales to 1600x1200.
voodoo3_ossc_1280x800.jpg

EDIT: Monitor is a Philips 252B9 in 4:3 mode.
EDIT2: Exposure was manually set at 1/15 second at ISO 50 using a Samsung Galaxy S7.

If I understand correctly, you're scaling to 1280x800 so you can stay at 70Hz, right? Can the OSSC, or some other cheaper device, circumvent the 720x400 issue as long as I'm willing to drop down to 60Hz?
Also, what does actual 720x400 text mode look like when scaled by a scaler that is set up for 320x200? Missing pixels?

Reply 230 of 295, by darry

User metadata
Rank l33t++
Oetker wrote on 2020-09-04, 06:13:
darry wrote on 2020-09-04, 05:14:

FX5900 upscaling to 1600x1200 over DVI.
fx5900_1600x1200.jpg
Note: this is not mis-focused. It actually is that soft.

Voodoo 3 sending 640x400 over VGA to the OSSC, which line-doubles it to 1280x800 and outputs over HDMI; the monitor then upscales to 1600x1200.
voodoo3_ossc_1280x800.jpg

EDIT: Monitor is a Philips 252B9 in 4:3 mode.
EDIT2: Exposure was manually set at 1/15 second at ISO 50 using a Samsung Galaxy S7.

If I understand correctly, you're scaling to 1280x800 so you can stay at 70Hz, right? Can the OSSC, or some other cheaper device, circumvent the 720x400 issue as long as I'm willing to drop down to 60Hz?
Also, what does actual 720x400 text mode look like when scaled by a scaler that is set up for 320x200? Missing pixels?

The way the OSSC works, the input refresh rate always equals the output refresh rate. This means that if the video card outputs 640x400@70Hz (line-doubled 320x200@70Hz) into the OSSC, then the OSSC will output 70Hz, no matter which output resolution is chosen.

The reason I chose 1280x800 (line2x) is maximum sharpness. Outputting 640x400 (passthrough) from the OSSC would mean having the monitor do all the upscaling, which is less sharp. Outputting 1600x1200 (line3x) would have been sharpest, but it is out of spec for the OSSC (and does not work well on my unit).

If 60Hz is acceptable to you but you want proper 640x400 and 720x400 sampling, an Extron DVI or HDMI 300 series unit is an option. You will need to define a preset for each of those modes and choose the right one when appropriate (the OSSC requires manual intervention too). The Extron can be had for much less than an OSSC, AFAIK.

As for 720x400 sampled as 640x400 and vice versa, I should be able to post some comparison photos later using the OSSC. That should represent the worst-case scenario: with the OSSC's sharpness, the improper sampling effect will be very apparent. With an Extron scaler, the effect will likely be less visible because of filtering/smoothing.

EDIT: Corrected typo

Last edited by darry on 2021-07-22, 12:46. Edited 1 time in total.

Reply 232 of 295, by darry

User metadata
Rank l33t++
Oetker wrote on 2020-09-04, 15:52:

What about setting the OSSC to 1600x400 and letting the monitor scale to 1200? Apparently it supports 1600x400 for mode 13h?

1600x400 without line3x cannot be set as such; the OSSC menu only allows 1600x400 with line3x (y axis), which is effectively 1600x1200 (the x axis is sampled at 1600 active pixels and the y axis is sampled at 400 but 3x line-multiplied). That is out of spec for the OSSC and generates shimmer/glitches/artifacts on some (many?) OSSC units, including mine. Asking the developers for an explicit 1600x400 (or even a 1600x800) mode could be an option (these may or may not be possible, however, due to OSSC limitations). How much better/sharper 1600x400 would be than 1280x800, if at all, would depend on the monitor used. 1600x800 would definitely be sharper than 1280x800, though.

EDIT: Corrected accuracy.

Last edited by darry on 2020-09-04, 18:18. Edited 1 time in total.

Reply 233 of 295, by Oetker

User metadata
Rank Oldbie
darry wrote on 2020-09-04, 16:14:
Oetker wrote on 2020-09-04, 15:52:

What about setting the OSSC to 1600x400 and letting the monitor scale to 1200? Apparently it supports 1600x400 for mode 13h?

1600x400 cannot be set as such; the menu only allows 1600x1200 (where the x axis is sampled at 1600 active pixels and the y axis is sampled at 400 but 3x line-multiplied), which is out of spec for the OSSC and generates shimmer/glitches/artifacts on some (many?) OSSC units, including mine. Asking the developers for an explicit 1600x400 (or even a 1600x800) mode could be an option (these may or may not be possible, however, due to OSSC limitations). How much better/sharper 1600x400 would be than 1280x800, if at all, would depend on the monitor used. 1600x800 would definitely be sharper than 1280x800, though.

I was going by https://github.com/marqs85/ossc/commit/4a686d … 2f1c1cd3ab1d5d9 but maybe I misunderstood.

Reply 234 of 295, by darry

User metadata
Rank l33t++
Oetker wrote on 2020-09-04, 18:04:
darry wrote on 2020-09-04, 16:14:
Oetker wrote on 2020-09-04, 15:52:

What about setting the OSSC to 1600x400 and letting the monitor scale to 1200? Apparently it supports 1600x400 for mode 13h?

1600x400 cannot be set as such; the menu only allows 1600x1200 (where the x axis is sampled at 1600 active pixels and the y axis is sampled at 400 but 3x line-multiplied), which is out of spec for the OSSC and generates shimmer/glitches/artifacts on some (many?) OSSC units, including mine. Asking the developers for an explicit 1600x400 (or even a 1600x800) mode could be an option (these may or may not be possible, however, due to OSSC limitations). How much better/sharper 1600x400 would be than 1280x800, if at all, would depend on the monitor used. 1600x800 would definitely be sharper than 1280x800, though.

I was going by https://github.com/marqs85/ossc/commit/4a686d … 2f1c1cd3ab1d5d9 but maybe I misunderstood.

I'm pretty sure we are talking about the same mode. 1600x400 is the sampling value, with line3x being applied on the y axis, giving an effective 1600x1200.

I should have said 1600x400 with line3x (y axis) instead of just saying 1600x1200. I have corrected my previous post. Sorry for adding to the confusion.

Anyway, the input is 640x400@70Hz, sampled as 1600x400@70Hz and line-multiplied on the y axis to give 1600x1200@70Hz at a 189MHz pixel clock (the OSSC cannot decouple input and output timings to apply reduced blanking). That is out of spec, so it does not work reliably on my OSSC, and likely not on other units either.
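For anyone wondering where the 189MHz figure comes from, a back-of-the-envelope sketch (assuming the standard 640x400@70Hz input timing of 800x449 total at 25.175MHz):

IN_HTOTAL, IN_VTOTAL, VREFRESH = 800, 449, 70.087  # standard 640x400@70 timing

def ossc_clock_mhz(h_samples, line_mult):
    # The OSSC locks output timing to input timing: the horizontal total
    # scales with the sampling rate, the vertical total with the line
    # multiplier, and the refresh rate is untouched.
    htotal = IN_HTOTAL * h_samples // 640
    vtotal = IN_VTOTAL * line_mult
    return htotal * vtotal * VREFRESH / 1e6

print(f"line2x, 1280 samples: {ossc_clock_mhz(1280, 2):.1f} MHz")  # ~100.7
print(f"line3x, 1600 samples: {ossc_clock_mhz(1600, 3):.1f} MHz")  # ~188.8

The OSSC's TMDS output is rated for 165MHz (the single-link DVI limit), so ~189MHz is effectively overclocking it, hence the glitches on many units, while line2x at ~101MHz is comfortably within spec.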

EDIT: If 1600x400 as passthrough, or 1600x400 with line2x on the y axis (an effective 1600x800 output), were possible, as mentioned in my previous post, it would be interesting.

Reply 235 of 295, by bestemor

User metadata
Rank Oldbie

That Voodoo+OSSC combo looks sooo much better than the directly scaled one! 😁
But buying that would cost me as much as a new monitor, it seems, after taxes and shipping and... and... etc... ouch.

On another note, any thoughts on this DVI EDID emulator?
https://www.lindy.co.uk/audio-video-c2/edid-d … -displays-p7179

It has a lot of stored presets*, but I can't see a way to _force_ one of them?
(* granted, it's missing the 640x400, but...)

I suppose the Extrons are better suited for our magical mystery tour, or...?
(not quite sure what to look for/how to 'program' these things... me just want to plug and play!)

Reply 236 of 295, by darry

User metadata
Rank l33t++
bestemor wrote on 2020-09-04, 21:25:

That Voodoo+OSSC combo looks sooo much better than the directly scaled one! 😁
But buying that would cost me as much as a new monitor, it seems, after taxes and shipping and... and... etc... ouch.

On another note, any thoughts on this DVI EDID emulator?
https://www.lindy.co.uk/audio-video-c2/edid-d … -displays-p7179

It has a lot of stored presets*, but I can't see a way to _force_ one of them?
(* granted, it's missing the 640x400, but...)

I suppose the Extrons are better suited for our magical mystery tour, or...?
(not quite sure what to look for/how to 'program' these things... me just want to plug and play!)

If you have not done so already, I suggest you test your monitor to see if:
a) >60Hz works in Windows using a modern PC over VGA and DVI/HDMI;
b) frames are skipped at 70Hz and 75Hz (try vsynctester.com).

That EDID emulator only allows you to clone the EDID of an existing monitor. For it to be useful, you would need one that can be programmed with a custom EDID.

If you want to keep costs low, use your existing monitor and keep things simple, I would suggest the Extron RGB-DVI 300 (or the HDMI variant). All you lose is the 70Hz capability under DOS and Windows (and you can always use a cheap HDMI switch to bypass the Extron and get a direct digital connection to the monitor for use in Windows at 70 or 75Hz). It may not be as sharp as an OSSC, but it is one of the better scalers on the market.

EDIT: Clarified which Extron product I meant.

Last edited by darry on 2020-09-04, 22:27. Edited 1 time in total.

Reply 238 of 295, by darry

User metadata
Rank l33t++
bestemor wrote on 2020-09-04, 22:03:

"...suggest the Extron"
I take it you now meant the EDID emulator (and not that scaler mentioned earlier)?

Sorry for the confusion. I meant getting the Extron RGB-DVI 300 as a scaler: https://media.extron.com/public/download/file … 8-1407-01_D.pdf