VOGONS


Reply 280 of 344, by matti157

User metadata
Rank Member
Prez wrote on 2020-05-28, 09:30:

Hi all !
I just received this one, very cheap, only $10, and it works so great !!
https://www.amazon.fr/gp/product/B07VW6VK66/r … 0?ie=UTF8&psc=1

From Windows 98 screens in 1024x768 with a perfect image to MS-DOS gaming in VGA, everything is fine !

Here is a first result, i think its perfect for this price :
https://www.youtube.com/watch?v=R8e6Nxl6VU8

I just used a cheap HDMI H264 encoder ($30); the actual video quality is even better than what you can watch on YouTube, of course!

Best regards
Philippe Dubois

Is it able to display the BIOS and the bootstrap phase?

Reply 281 of 344, by dekkit

User metadata
Rank Member
iraito wrote on 2023-01-25, 09:02:

Yep, it does just that. Take into account that the pictures you see posted are passing through a generic capture card that's a bit blurry and has horrible red-channel resolution; the actual results are incredibly crisp, and you can center the picture in pretty much all cases. I was also able to upscale 640x480 with pretty good results. It's honestly one of the best methods for this kind of stuff: it's open source, readily available, and cheap.

Out of interest what is your setup?
And in particular what graphic card are you using?

Reply 282 of 344, by iraito

User metadata
Rank Member
dekkit wrote on 2023-01-28, 12:06:
iraito wrote on 2023-01-25, 09:02:

Yep, it does just that. Take into account that the pictures you see posted are passing through a generic capture card that's a bit blurry and has horrible red-channel resolution; the actual results are incredibly crisp, and you can center the picture in pretty much all cases. I was also able to upscale 640x480 with pretty good results. It's honestly one of the best methods for this kind of stuff: it's open source, readily available, and cheap.

Out of interest what is your setup?
And in particular what graphic card are you using?

My setup is: Retro PC ---> VGA splitter ---> VGA to VGA2SCART sync converter ---> GBS-C ---> VGA with sync converter and upscale to HDMI ---> HDMI to capture card ---> OBS

That's only for DOS and 640x480 (from 800x600 up I send the signal converted to HDMI directly to the capture card). You can see the GPUs I use in my sig; the Win95 machine uses a Matrox G200 and a Voodoo2.

If you wanna check a blue ball playing retro PC games
MIDI Devices: RA-50 (modded to MT-32) SC-55

Reply 283 of 344, by gen_angry

User metadata
Rank Newbie

So I have a bit of a strange issue.

I have a 4 port KVM that's connected to all three of my retro rigs. From there, I have it going to a passive VGA splitter. From there, it's split to output to my monitor and my atlona (which then goes to my capture card).

So it's like this:

(rigs) ---> (KVM) ---> (VGA splitter) ---> (short VGA cable) ---> (Atlona) ---> (capture card)
                                      \---> (Monitor)

When I try to power on, nothing displays until I boot into Windows (if it's set to a locked resolution). On my 98 rig, it goes out of range immediately.

However, when I unplug the Atlona, it works fine from power-on to desktop. It also works fine through the Atlona if I unplug my monitor. And it works on both if I start up with one of the two plugged in and then plug in the other after power-on. This leads me to think it's an EDID problem: the monitor and the Atlona are both sending data at the same time at initial power-on, which interferes.
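For context, EDID itself is a 128-byte block returned over the DDC lines, with a fixed 8-byte header and a final checksum byte chosen so the whole block sums to zero mod 256; two devices answering on the same wires at once can easily corrupt that. A minimal sketch of the block-level check (generic, not specific to this KVM setup):

```python
# Minimal EDID sanity check: a base EDID block is 128 bytes,
# starts with a fixed 8-byte header, and all bytes sum to 0 mod 256.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_valid(block: bytes) -> bool:
    return (
        len(block) == 128
        and block[:8] == EDID_HEADER
        and sum(block) % 256 == 0
    )

# Build a dummy block whose last byte makes the sum 0 mod 256.
dummy = bytearray(128)
dummy[:8] = EDID_HEADER
dummy[127] = (256 - sum(dummy) % 256) % 256
print(edid_block_valid(bytes(dummy)))  # True
```

A read garbled by two devices driving the bus would typically fail the header or checksum test, which fits the "works with either device alone, fails with both" symptom.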

Normally I'd just lock the resolution, leave it, and let it boot, but my DOS rig can't do that; I can't use the BIOS, nor can I use a boot menu (which my XP/Win7 rig has). So that doesn't work.

So... is there anything I can actually do about it? Just trying to come up with ideas for a solution. Cut a pin or two from the little VGA cable so only the monitor returns info?

Thanks 😀

Reply 284 of 344, by dekkit

User metadata
Rank Member
gen_angry wrote on 2023-01-28, 22:27:

So I have a bit of a strange issue.

I have a 4 port KVM that's connected to all three of my retro rigs. From there, I have it going to a passive VGA splitter.

Could this also be an issue from using a 'passive' non-powered vga splitter?

Also, re: GBS Control, I couldn't quite get it to where I needed it to be, i.e. the whole picture comfortably fitting on the screen. It seems so close, but when I scale the horizontal further to see more, I get blur/jitter/image corruption.

I've updated my earlier post with a pic of the MS-DOS 'Edit' screen.

dekkit wrote on 2023-01-26, 10:24:

UPDATE #2

This is likely a combination of my graphics card/chip (the GPU is a CHIPS F65540), my LCD monitor, and GBS Control.

I'm not running any graphics driver in MS-DOS, so that may also be a factor with my video card.

Reply 285 of 344, by gen_angry

User metadata
Rank Newbie
dekkit wrote on 2023-01-29, 02:30:

Could this also be an issue from using a 'passive' non-powered vga splitter?

I thought so too initially but it works as long as only one device is connected on power on. If it were the splitter, would it not work then?

Reply 286 of 344, by iraito

User metadata
Rank Member

With my Matrox I can calibrate the monitor from Windows and get it centered; in DOS I just had to set it with the GBS. You might have to further mod the GBS to make it more stable, or maybe you can fix everything by tweaking the GPU.


Reply 287 of 344, by dekkit

User metadata
Rank Member
iraito wrote on 2023-01-29, 07:42:

With my Matrox I can calibrate the monitor from Windows and get it centered; in DOS I just had to set it with the GBS. You might have to further mod the GBS to make it more stable, or maybe you can fix everything by tweaking the GPU.

Next time you power on your setup for MS-DOS, would you mind taking a screen cap of your GBS Control settings? (Particularly anything advanced, or any option buried deep that you needed to adjust.)

I just want to rule out anything I've missed. I'll continue exploring again once I redo my GBS Control mod (my mishmash of extra mods has finally pushed the wiring to its limits and something snapped, so it will need fixing next time I've got the soldering iron out). Also thinking of trying a chain of various devices in different orders (cheap VGA to HDMI etc., then converting into GBS Control to see if I can restore the ratio using move/scale settings).

Worth mentioning that previous testing of Pentium 90 and Pentium 120 Toshiba laptops and a desktop P2-233 (S3 ViRGE) worked really well on my LCD monitors (no image cut off etc.) without needing anything extra at all in MS-DOS mode, playing Doom/Duke 3D. So I can see that upscaling these in GBS Control would work really well. That said, there is certainly something super quirky going on with the much older video chip on this 486 (CHIPS F65540); maybe the refresh rates vary too much or are too non-standard. Not really sure why the difference.

Vertical/height is fine (on just about all upscaling devices I've used/tested).
Horizontal/width, however, is confusing the crap out of most scalers; the Extron 300 or 300a seems a better fit for this older 486 at the moment (but even that needs readjustment for some games).

I'll continue experimenting.

gen_angry wrote on 2023-01-29, 02:55:

I thought so too initially but it works as long as only one device is connected on power on. If it were the splitter, would it not work then?

As I understand it, a passive splitter cable will degrade and likely lower the signal strength (do the colors go a little darker when both are connected?). An active VGA video splitter will likely have circuitry to handle the conditions you mention and also boost the signal. I'd give one a go and keep the receipt 😉

Reply 288 of 344, by gen_angry

User metadata
Rank Newbie

I remembered that older VGA monitors, before EDID, often had fewer pins. So, on a whim (the short VGA patch cable was super cheap anyway; if this hadn't worked, I would have picked up another alongside the suggested active splitter), I found a pinout of the VGA spec and tore out pins 4, 9, 12, and 15 with a pair of needle-nose pliers to 'neuter' the EDID response of the Atlona. (I also tore out pin 7 by mistake, misjudging the middle row's alignment, but that's just a ground pin that looks tied to the others anyway.)

It works now. Both outputs display from power-on and look great on all three rigs (a tad darker, yes, but only very slightly, and easily corrected even just by monitor adjustment). I may even be able to fix that by getting a power brick for the KVM.

A bit crude, but I'm not complaining. 😀

If anyone else has a similar setup to mine and doesn't mind sacrificing a cheap VGA patch cable (definitely do NOT do this to a monitor's built-in plug), give that a try. Just be careful you've got the right pins.
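For anyone attempting the same trick, here is a rough map of the DE-15 pins involved, per the commonly published VGA/DDC pinout. The exact roles of the legacy ID pins varied between eras, so treat this as a sketch and verify against a proper pinout diagram before cutting anything:

```python
# Rough VGA (DE-15) pin map relevant to the EDID/DDC 'neutering' trick.
# Roles follow the commonly published VESA DDC2 pinout; the legacy
# ID-pin assignments varied, so double-check before cutting.
VGA_PINS = {
    1: "Red video", 2: "Green video", 3: "Blue video",
    4: "ID2 / reserved (legacy monitor ID)",
    5: "Ground", 6: "Red return", 7: "Green return", 8: "Blue return",
    9: "+5 V (DDC power)", 10: "Sync ground",
    11: "ID0 / reserved", 12: "ID1 / DDC data (SDA)",
    13: "Horizontal sync", 14: "Vertical sync",
    15: "DDC clock (SCL)",
}

DDC_PINS = [4, 9, 12, 15]  # the pins removed in the post above
for p in DDC_PINS:
    print(p, VGA_PINS[p])
```

With pins 12 and 15 gone the I2C bus to that device is severed, so only the remaining device can answer EDID queries.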

Reply 290 of 344, by dekkit

User metadata
Rank Member

...digging deeper into the 486 'Chips' video card, it has become clearer that its video output isn't to spec.

For MS DOS text mode (assuming 720x400):

Actual (Video= 'CHIPS F65548')
33.4 kHz
69 Hz

Expected (based on what I could find in a few monitor user manuals)
31.469 kHz
70.87 Hz
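Those expected figures fall out of the commonly published 720x400@70Hz VGA text-mode timing (28.322 MHz pixel clock, 900 total pixels per scanline, 449 total lines per frame; these numbers are the standard ones, not from this card's datasheet). A quick sketch of the arithmetic:

```python
# Sanity check of the 'expected' numbers from the standard VGA
# 720x400@70Hz text-mode timing: 28.322 MHz pixel clock,
# 900 total pixels per scanline (720 active + blanking),
# 449 total lines per frame (400 active + blanking).
pixel_clock_hz = 28_322_000
h_total = 900
v_total = 449

h_freq_khz = pixel_clock_hz / h_total / 1000
v_freq_hz = pixel_clock_hz / (h_total * v_total)

print(f"{h_freq_khz:.3f} kHz")  # 31.469 kHz
print(f"{v_freq_hz:.2f} Hz")    # 70.09 Hz
```

The measured 33.4 kHz / 69 Hz would then imply the card is using a faster pixel clock and/or different total counts than the standard mode.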

FYI, here is what my Samsung 940N LCD is reporting, even though it is actually showing the image really well (with no scaler in between):

[Attachment: 20230205_134024_HDR.jpg, CC-BY-4.0]

So it's important to keep this in mind with any reports (including my own!) that a certain upscaler or even LCD monitor combination 'didn't work properly'.

It's very likely the LCD screens/upscalers have a tolerance threshold that a non-standard signal falls well outside of (hence the need to mess with manual adjustments). CRT screens are not as fussy.

I'm going to see if I can find any way to adjust/correct/clean the output kHz and Hz on the video card itself, or whether there is a circuit that may help (if such a thing exists), before I try to scale the image and plug it back into a GBS or Extron.

UPDATE #1
It's definitely the video card in this 486. The default mode uses out-of-spec refresh rates (confirmed in the video IC's datasheet), with a few tweaking options found in the Win95 drivers.

It was originally configured to support both a directly connected digital LCD panel and an external CRT. Now looking at how to fix this in DOS.

GBS Control via RGBs looks like it may be the cheap winner after all.

UPDATE #2
Now that I have resolved the weird VGA refresh issue in DOS, I decided to re-test my scalers.

What is interesting is that my cheap AliExpress VGA to HDMI converter actually performed worse than before, specifically for 720x400 (the resolution typical of boot screens): it gave a blank screen until I loaded a game, and it still cropped parts of the screen.

GBS Control was a lot easier to work with now, and even worked OK when feeding VGA H/V sync into it, but I will test further.

The Extron 300 worked even better than before: once I loaded a game that used 720x400 and selected auto-adjust in the 300's menu, it snapped the picture into the centre with the correct aspect ratio, and it retained the setting after a reboot.

Last edited by dekkit on 2023-02-19, 12:51. Edited 2 times in total.

Reply 291 of 344, by clb

User metadata
Rank Member

Hi all,

reading through this long thread of challenges, it makes me happy to present a new option for digital video on PC, for both viewing on a flat panel or for video capture purposes. Check out CRT Terminator Digital VGA Feature Card ISA DV1000 for more details.

Many of the issues discussed in this thread (resolution inconsistencies, blurriness, pixel noise, image border framing, 640 vs 720 pixel detection, 60 Hz vs 70 Hz framerates, incorrect aspect ratio, jailbars, and many other problems) are avoided completely with CRT Terminator, since the VGA adapter's DAC is bypassed. That makes me cautiously optimistic that this might prove to be the "ultimate solution" that many people have been looking for here! 😀

Reply 292 of 344, by christal87

User metadata
Rank Newbie

Fascinated that there were small flat-panel displays hooked up to feature connectors back in the day for a variety of industrial purposes, the same idea came to my mind from time to time. I would have needed a lot of time and research (also testing) to cross-reference feature connector capabilities/pinouts on multiple cards, brush up on 8-bit video DACs, scaling, DVI conversion and so on. So I chased it away; it's too complex, at my level of knowledge, to get off the ground and make somewhat universally compatible. I was also pondering whether it would be usable on a wide variety of cards. Nice to see someone thinks alike and is making this a reality! I see a GCT-brand Type-C connector in one of the pics and an unpopulated quad flat package. Is it only for flashing the hardware, or a hint at future DisplayPort handling capability? I'm more than satisfied with a DVI-D output, but if that's the case it would be pretty awesome!
Kudos & kiitos for this project! 😀

clb wrote on 2023-02-05, 23:13:

Hi all,

reading through this long thread of challenges, it makes me happy to present a new option for digital video on PC, for both viewing on a flat panel or for video capture purposes. Check out CRT Terminator Digital VGA Feature Card ISA DV1000 for more details.

Many of the issues discussed in this thread (resolution inconsistencies, blurriness, pixel noise, image border framing, 640 vs 720 pixel detection, 60 Hz vs 70 Hz framerates, incorrect aspect ratio, jailbars, and many other problems) are avoided completely with CRT Terminator, since the VGA adapter's DAC is bypassed. That makes me cautiously optimistic that this might prove to be the "ultimate solution" that many people have been looking for here! 😀

Reply 293 of 344, by clb

User metadata
Rank Member
christal87 wrote on 2023-02-06, 16:52:

Also I was pondering over if it would be usable on a wide variety of cards. Nice to see someone thinks alike and making this a reality!

Thanks!

We are aiming for universality. As long as the digital video signal integrity is intact, we want to support it. In the past two years I have accumulated several dozen graphics cards from eBay for testing, and have had to start telling myself to stop getting even more. It makes me chuckle to see how differently cards interpreted the standard "wrong" (I write "wrong" in quotes since there really never was any proper standard besides the pinout; it is just a dump of the raw video signal from each board). It must have been a nightmare for MPEG accelerators etc., but it's awesome fun detective work for a retro tinkerer.

I have an ATI 28800-6 card from 1992 that has such a poor RAMDAC that the VGA output is really noisy and shimmery even on a CRT, but from its feature connector bus, with a few quirk-handling modes, one can get a crystal-clear image, bringing the card back to life for a round of Alley Cat!

christal87 wrote on 2023-02-06, 16:52:

I see a GCT brand Type-C connector on one of the pics and an unpopulated quad flat package. Is it only for flashing the hw or a hint at a future Displayport handling capability? I'm more than satisfied with a DVI-D output, but if that's the case it would be pretty awesome!

The USB-C connector is for flashing updates to the hardware. On that proto card the subsystem was not fully populated. On final boards we want to have a system for providing firmware upgrades in case bugs or anything else are found, i.e. "insurance" against unknowns.

christal87 wrote on 2023-02-06, 16:52:

Kudos & kiitos for this project! 😀

Kiitos!

Reply 294 of 344, by christal87

User metadata
Rank Newbie
clb wrote on 2023-02-06, 17:21:

We are aiming for universality. As long as the digital video signal integrity is intact, we want to support it. In the past two years I have accumulated several dozen graphics cards from eBay for testing, and have had to start telling myself to stop getting even more. It makes me chuckle to see how differently cards interpreted the standard "wrong" (I write "wrong" in quotes since there really never was any proper standard besides the pinout; it is just a dump of the raw video signal from each board). It must have been a nightmare for MPEG accelerators etc., but it's awesome fun detective work for a retro tinkerer.

I have an ATI 28800-6 card from 1992 that has such a poor RAMDAC that the VGA output is really noisy and shimmery even on a CRT, but from its feature connector bus, with a few quirk-handling modes, one can get a crystal-clear image, bringing the card back to life for a round of Alley Cat!

I can totally agree that the standard was implemented very loosely by a lot of design houses back then; I had my fair share of raised-eyebrow moments with it. I guess plain simple CRTs (especially without multisync) had no problems with the signal being a bit out of spec: turn a pot here and there and it was OK. Dropping precision electronics as a way of cutting costs to position a specific VGA card in the market is another culprit. The VESA "specs" state: "RGB signals take values in a continuous (analog) voltage range from +0V to +0.7V". A few years ago I had a Voodoo2 with a RAMDAC issue: the green output driver died because its signal was very weak. At that time, before the fix, for curiosity's sake I measured all the color signals and compared them to an S3 card. My conclusion was that the 3DFx (TI, GENDAC, etc.) DACs conformed, but the S3 card was out of spec with its ~1.6V max levels (86C368 w/ integrated RAMDAC). On top of that, more modern monitors had some kind of 2-step gain switch (0.7/1V input). At that moment I realized that what my friends had mysteriously identified as "a very bright, crisp image" back in the day was because S3 was "overdriving" the outputs or something. Besides its cheapness at the time, owning an S3+V2 combo I guess somewhat made up for the V2 output switch circuit's signal losses when Glide was inactive. Also, 3DFx and STB cards had a lot of PCB layers, but none of those companies ever tried to use 50-ohm controlled-impedance traces to avoid reflections and noise on that analog switch circuit. 😀
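A bit of rough arithmetic on those levels (assuming both figures were measured the same way, into proper termination):

```python
# Rough comparison of the quoted S3 output level against the VESA
# analog RGB full-scale of 0.7 V. Both numbers come from the post
# above; this is just the ratio.
spec_full_scale_v = 0.7
measured_s3_v = 1.6

overdrive = measured_s3_v / spec_full_scale_v
print(f"{overdrive:.2f}x")  # ~2.29x the spec full-scale level
```

Even against a monitor's 1.0 V gain setting that would still be roughly 1.6x hot, which would fit the "very bright, crisp image" impression.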

Reply 295 of 344, by clb

User metadata
Rank Member
christal87 wrote on 2023-02-06, 19:40:

..., but the S3 card was out of spec with its ~1.6V max levels (86C368 w/ integrated RAMDAC).

Voltage levels being all over the place is certainly something we have seen as well 😀

Reply 296 of 344, by clb

User metadata
Rank Member

On the topic of this thread, here is a comparison of some text from the OSSC's 800x600 VGA-to-digital conversion versus CRT Terminator viewing the same content.

The OSSC image has varying softness across the scanline: some columns of text are sharper and others are softer. The image is shifted to the right and cut off, and there is a column of black pixels on the left. There is slight banding (in some other images I find jailbars, though this particular one avoids them quite well).

CRT Terminator, on the other hand, is crisp and pixel-perfect in upscaling 800x600 -> 1600x1200, as one might expect.

EDIT: Replaced the OSSC capture with a new one to correct my previous mistake of not manually configuring the OSSC sampling settings. For the SEA Image Viewer application, I needed to change H. sample rate in Advanced Settings from 1056 to 1048, H. back porch from 88 to 101, and Sampling Phase from 180 degrees to 348 degrees. (The values 1048 and 101 were analyzed from the input video stream passed into CRT Terminator.)

Attachments

Last edited by clb on 2023-02-07, 18:20. Edited 2 times in total.

Reply 297 of 344, by maxtherabbit

User metadata
Rank l33t
clb wrote on 2023-02-06, 23:52:

The OSSC image has varying softness across the scanline: some columns of text are sharper and others are softer. The image is shifted to the right and cut off, and there is a column of black pixels on the left. There is slight banding (in some other images I find jailbars, though this particular one avoids them quite well).

All of that can be eliminated by properly adjusting your sampling settings and sampling phase in the OSSC; it is not a plug-and-play device and should not be treated as such.

Reply 299 of 344, by clb

User metadata
Rank Member

To clarify my previous mistake, I took time to configure the OSSC sampling to find the best parameters for the same picture. I deleted the previous capture to avoid giving anyone false impressions, and replaced it with an improved capture of much better quality.

I verified that the video frame sampling parameters in that capture are correct down to exact clock cycles for this particular video mode (horizontal: front porch 19, sync 128, back porch 101, active 800; vertical: front porch 2, sync 2, back porch 23, active 600), and adjusted the other settings so that they give the best visual result. (Maybe the OSSC capture could still be improved further; this was the best I was able to find browsing the different menus.)
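As a quick self-consistency check on those numbers: the per-line budget (front porch + sync + back porch + active) must add up to the OSSC's H. sample rate, and the vertical budget gives the total line count per frame. A sketch with the values above:

```python
# Check that the OSSC per-line sampling budget adds up: front porch +
# sync + back porch + active must equal the H. sample rate (total
# samples per scanline). Values taken from the post above.
h = {"front_porch": 19, "sync": 128, "back_porch": 101, "active": 800}
h_sample_rate = sum(h.values())
print(h_sample_rate)  # 1048, matching the H. sample rate set earlier

v = {"front_porch": 2, "sync": 2, "back_porch": 23, "active": 600}
print(sum(v.values()))  # 627 total lines per frame
```

If the four horizontal numbers did not sum to the configured sample rate, the sampling grid would drift relative to the pixels and produce exactly the column-to-column softness described earlier.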

Thanks maxtherabbit for keeping me from making an ignorant comparison!