VOGONS


First post, by Kyle

User metadata
Rank Newbie

I've been through a few threads regarding this and am looking for input before buying another card. I am looking for:

- PCI
- DVI
- Passively cooled
- DOS compatibility

I picked up an FX 5200. It is working fine, but the output is a tad soft. I saw recommendations for a Radeon 7000. Any other cards I should consider?

Reply 1 of 16, by Jo22

User metadata
Rank l33t++

Hi Kyle! I heard DVI is limited to 60Hz, and some VGA games do run at 72 or 75Hz, I believe.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 2 of 16, by Rawit

User metadata
Rank Oldbie

I'm looking for such a thing myself. I also used an FX5200, but the output is soft, as it scales everything to 1024x768. That makes it highly compatible with a bunch of displays, but most of the time the scaled output gets scaled again, which is ugly. It seems to be limited to 60Hz under DOS.

The Matrox G200 with DFP module is another card I'm trying to get running, but so far I'm not getting anything to display under DOS. It looks like it ignores everything above 60Hz.

The S3 Savage4 with DFP works pretty well, but normal DOS games are displayed as 640x480@60Hz, with black bars added at the top and bottom. I'm not sure if my display (a TV at the moment) identifies it incorrectly or if it's forced by the card.

DVI supports 70Hz, it's just that a lot of chips used for digital out are limited to 60Hz. Most PanelLink chips are 1280x1024 / 60Hz max. DFP stands for Digital Flat Panel and can be converted to DVI by a passive adaptor.
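Rawit's point (that the 60Hz cap comes from the transmitter chips, not from DVI itself) can be sanity-checked with rough numbers. Here is a back-of-the-envelope sketch of my own; the ~25% blanking overhead is an assumption, not a measured figure:

```python
# Rough pixel-clock needs for some DOS-era modes, assuming ~25% blanking
# overhead (an approximation; real VESA/GTF timings differ slightly).
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    """Estimated pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * blanking / 1e6

SINGLE_LINK_DVI_MHZ = 165  # single-link DVI TMDS pixel-clock ceiling

for w, h, hz in [(640, 400, 70), (640, 480, 72), (1280, 1024, 60)]:
    clk = approx_pixel_clock_mhz(w, h, hz)
    print(f"{w}x{h}@{hz}Hz needs ~{clk:.1f} MHz "
          f"({'fits' if clk <= SINGLE_LINK_DVI_MHZ else 'exceeds'} single-link DVI)")
```

Even 640x480@72Hz needs well under 30 MHz, a fraction of the 165 MHz single-link ceiling, so a chip that refuses 70Hz modes is limited by its own design or EDID, not by DVI bandwidth.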

YouTube

Reply 3 of 16, by Jo22

User metadata
Rank l33t++

Um, I don't mean to discourage you guys,
but why don't you use a good, old VGA CRT instead?

If space is a problem, a cheap LED projector may solve that issue.
IMHO, these are often equipped with composite and VGA ports.

And they run in the correct, abysmally low resolution modes (~320x240, sometimes 640x480).
Sample: https://www.youtube.com/watch?v=cJ3yduw0z8E

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 4 of 16, by Kyle

User metadata
Rank Newbie

I have several CRTs and a legit projector that takes VGA. If the answer is that there is no answer, that is okay. I would just like to get input from someone who has fought this battle before me. There were a few scattered posts on this, but nothing definitive that I found.

My monitor has an 'edge enhancement' option that makes the 5200 look a good deal better. It works much better than adjusting the sharpness. At the end of the day I am not opposed to running VGA either. I would just prefer a digital signal. I can see the analog quirks/interference that are not there on digital.

Any input from ATI/Matrox owners?

Reply 5 of 16, by Jo22

User metadata
Rank l33t++

Ah okay, that makes sense. But I'd like to note that a digital connection isn't always better.
I once had a SyncMaster 27" screen with both HDMI and VGA inputs.
Turned out HDMI was more blurred than VGA. Why? I don't know. People on the net said
that digital connections sometimes fall victim to "picture enhancement features", like scalers and such.
Don't know if that was the reason in my case, but it sounds plausible (the same panel was used in a TV, too).

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 6 of 16, by yawetaG

User metadata
Rank Oldbie
Jo22 wrote:

Ah okay, that makes sense. But I'd like to note that a digital connection isn't always better.
I once had a SyncMaster 27" screen with both HDMI and VGA inputs.
Turned out HDMI was more blurred than VGA. Why? I don't know. People on the net said
that digital connections sometimes fall victim to "picture enhancement features", like scalers and such.
Don't know if that was the reason in my case, but it sounds plausible (the same panel was used in a TV, too).

Flat panel TVs also have that problem. On my Samsung TV, analog sent over component video gets upscaled and therefore looks very bad, while anything that goes via SCART is handled fine.

Reply 7 of 16, by Rawit

User metadata
Rank Oldbie

Every signal that isn't the same resolution as the native resolution of the screen gets upscaled with flat panels. The loss of quality is because of something else, perhaps deinterlacing or a denoise filter.

In my case the DFP/DVI out of my Savage4 is a lot better than its VGA or Matrox VGA for example. Even a Gefen VGA to HDMI scaler can't beat it.

YouTube

Reply 8 of 16, by Jo22

User metadata
Rank l33t++
yawetaG wrote:

Flat panel TVs also have that problem. On my Samsung TV, analog sent over component video gets upscaled and therefore looks very bad, while anything that goes via SCART is handled fine.

Yeah, I noticed this, too. My Samsung SyncMaster had a sister model, which was a full-fledged TV set.
Perhaps the SCART connection was better because it was properly shielded, or because component was processed by an external chip.
Or, if you used a vintage console, maybe it was because of the low-res signals. Modern TVs are often unaware of old 240p (rather 243p) and 288p signals.
They are technically interlaced signals, but carry progressive picture information. In simple words, the odd and even fields look the same.

Anyway, I'm not good at explaining things, I'm afraid. 😅 There's a Wikipedia entry about it.
I'm just mentioning it, since the whole topic is constantly popping up in the retro gaming community (esp. NES and Atari 2600).
https://en.wikipedia.org/wiki/Low-definition_television
http://www.hdretrovision.com/240p/
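As a rough sketch of the 240p trick described above (NTSC numbers; my own illustration, not taken from the linked pages):

```python
# Interlaced NTSC sends 525 lines per frame as two 262.5-line fields;
# the extra half line makes the second field scan between the lines of
# the first. Classic consoles drop the half line and send identical
# 262-line fields, so every field redraws the same lines: an effectively
# progressive ("240p") picture at ~60 fields per second.
LINES_PER_FRAME = 525
interlaced_field = LINES_PER_FRAME / 2   # 262.5 -> fields are offset
console_field = 262                      # whole lines -> no offset

print("interlaced fields offset:", interlaced_field % 1 != 0)  # True
print("console fields offset:", console_field % 1 != 0)        # False
```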

Rawit wrote:

Every signal that isn't the same resolution as the native resolution of the screen gets upscaled with flat panels. The loss of quality is because of something else, perhaps deinterlacing or a denoise filter.

True, but in my case it was something else, I believe. I ran the screen at its native resolution (1920x1080),
and there VGA was treated as a serious, business "PC" input while the HDMI signal was considered a consumer input.
And since DVI and HDMI are relatives, this could also affect DVI inputs.

Speaking of Full HD, I'm surprised how good VGA is when using quality cables.
Normally, one would think the difference between it and digital video would be more apparent.

Rawit wrote:

In my case the DFP/DVI out of my Savage4 is a lot better than its VGA or Matrox VGA for example. Even a Gefen VGA to HDMI scaler can't beat it.

That reminds me of my old S3 UniChrome card. It had excellent VGA output, but wasn't really
Vista/7 compatible (no Aero Glass) and couldn't accelerate Flash.
Which is a shame, because the card was otherwise awesome in XP. DVD playback was great due to
a quality inverse discrete cosine transform (iDCT) and other hardware-assisted de-noise/de-blocking features.
I used PowerDVD 6 Deluxe, by the way. ^^

Edit: Parts of the text corrected. Really, I should stop writing stuff in the middle of the night, hah. 😅
Edit: Link added.

Last edited by Jo22 on 2017-08-17, 00:48. Edited 4 times in total.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 9 of 16, by Jo22

User metadata
Rank l33t++

Oh, and here's a little something I used for testing. The pattern comes in handy for checking video quality.
If the connection is poor, you will see some noise and flickering on screen.

Attachments

  • flicker_test.jpg (12.86 KiB): Test pattern.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 10 of 16, by lvader

User metadata
Rank Member

Voodoo cards with 2D capabilities (Banshee, Voodoo 3, etc.) have very good analogue VGA output and compatibility, but if you are using an LCD screen, I found the need to continuously correct geometry an annoyance. I tried an FX5200 DVI, but the output is very soft. Since then I've switched to a Voodoo 5500 DVI and it works great: it does 640x400@70Hz with no problems and looks great.

Reply 11 of 16, by Jepael

User metadata
Rank Oldbie
lvader wrote:

Since then I've switched to a Voodoo 5500 DVI and it works great: it does 640x400@70Hz with no problems and looks great.

That's weird, as the 400-line 70Hz mode should be 720x400, not 640x400. Does the video card scale it?

Reply 12 of 16, by boxpressed

User metadata
Rank Oldbie

Does a PCI card with DVI output exist that could work in a 486 (with PCI) motherboard? Or would it require too much power, or run into other complications?

I have an Avermedia Game Broadcaster HD capture card that reports all VGA input as "out of range," unfortunately.

Reply 13 of 16, by lvader

User metadata
Rank Member
Jepael wrote:
lvader wrote:

Since then I've switched to a Voodoo 5500 DVI and it works great: it does 640x400@70Hz with no problems and looks great.

That's weird as 400-line 70Hz mode should be 720x400, not 640x400. Does the video card scale it?

I believe the analogue output is 720x400 and the DVI output is 640x400, which seems more correct to me, as it's 2x 320x200.

Reply 14 of 16, by Rawit

User metadata
Rank Oldbie

720x400 is normally text mode. Mode 13h etc. should be 640x400, but most flat panels misidentify the signal. My Gefen scaler identifies the signal coming from an Atari Falcon as 720x400 when it's set to 640x400 mono, and it does the same for DOS games.

YouTube

Reply 15 of 16, by Jepael

User metadata
Rank Oldbie
Rawit wrote:

720x400 is normally text mode. Mode 13h etc. should be 640x400, but most flat panels misidentify the signal. My Gefen scaler identifies the signal coming from an Atari Falcon as 720x400 when it's set to 640x400 mono, and it does the same for DOS games.

That is right, and the reason is that the sync timings for these two analog VGA signals look identical to the monitor, so the monitor cannot know whether it is an 800-pixel-wide (640 active) or a 900-pixel-wide (720 active) signal; or rather, whether it was generated with a 25.175 MHz or a 28.322 MHz pixel clock. That's why some TFT monitors allow switching between 640- and 720-wide modes when in analog VGA mode, so the user can match the monitor to what is actually being sent.

DVI, being digital, carries its own pixel clock, and the link encodes which pixels are active and which are blanking, so the display knows the difference.
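The timing ambiguity Jepael describes is easy to verify numerically. A small sketch of my own, using the textbook VGA timing figures (800/900 total pixels per line, 449 total lines):

```python
# Why a monitor cannot tell 640x400 from 720x400 by sync timing alone:
# both standard VGA 400-line modes produce the same H and V rates.
modes = {
    "640x400@70 (graphics)": {"clock_mhz": 25.175, "h_total": 800, "v_total": 449},
    "720x400@70 (text)":     {"clock_mhz": 28.322, "h_total": 900, "v_total": 449},
}

for name, m in modes.items():
    h_khz = m["clock_mhz"] * 1000 / m["h_total"]  # horizontal sync rate
    v_hz = h_khz * 1000 / m["v_total"]            # vertical refresh rate
    print(f"{name}: H = {h_khz:.2f} kHz, V = {v_hz:.2f} Hz")
# Both modes come out at H = 31.47 kHz, V = 70.09 Hz,
# so from the monitor's side the sync signals are indistinguishable.
```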