VOGONS


VGA CRT with an R9 290x


First post, by frisky dingo

Rank: Member

Ghosts

Last edited by frisky dingo on 2015-06-26, 03:54. Edited 1 time in total.

Reply 1 of 27, by jwt27

Rank: Oldbie

I'd say boycott AMD for removing analog video and get a Geforce instead!

For converters, see if you can find one that supports hdmi 2.0 or displayport 1.4. These standards support a 600MHz pixel clock, so I would assume a VGA DAC with these specs to be sharper. An R9-290x only supports hdmi 1.4 and dp 1.3 though, which go up to 340MHz (not even enough for 2048x1536 at 75Hz).
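As a rough illustration of where that pixel-clock ceiling bites, here is a simplified GTF-style estimate in Python (a sketch only - real GTF/CVT timings round to whole character cells and lines, so treat the numbers as approximate):

    # Simplified VESA GTF-style pixel clock estimate (sketch; real GTF/CVT rounds to
    # whole character cells and lines, so results are approximate).

    def gtf_pixel_clock_mhz(h_active, v_active, refresh_hz):
        v_total = v_active + 50                          # starting guess for total lines
        h_freq_khz = refresh_hz * v_total / 1000.0
        for _ in range(10):                              # blanking depends on line rate, so iterate
            vsync_bp = round(550e-6 * h_freq_khz * 1e3)  # lines in the 550 us sync + back porch window
            v_total = v_active + vsync_bp + 1            # plus one line of front porch
            h_freq_khz = refresh_hz * v_total / 1000.0
        duty_pct = 30.0 - 300.0 / h_freq_khz             # GTF ideal horizontal blanking duty cycle (%)
        h_total = h_active / (1.0 - duty_pct / 100.0)
        return h_total * h_freq_khz / 1000.0             # total pixels per line x line rate -> MHz

    for mode in [(2048, 1536, 75), (2048, 1536, 60), (1600, 1200, 85)]:
        print(mode, "needs roughly %.0f MHz" % gtf_pixel_clock_mhz(*mode))

With these assumptions, 2048x1536 at 75Hz lands right around 340MHz, which is exactly why the older single-link limit is marginal for a high-end CRT.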

Reply 2 of 27, by alexanrs

Rank: l33t

I'd be more worried about lag than image quality. The easiest way out would be exchanging your ATI card for an older equivalent one or an equivalent GeForce.

Reply 3 of 27, by calvin

Rank: Member

You're likelier to get VGA out of DisplayPort.

Even then, VGA is essentially obsolete. Monitors now ship at resolutions higher than VGA can cleanly drive, and you'll have to deal with analog noise. The next-gen Intel IGP is dropping VGA, and most high-end GPUs have already followed suit.

The only reasons to use VGA now are projectors and retro hardware, and that niche is better served by other retro hardware anyway.

2xP2 450, 512 MB SDR, GeForce DDR, Asus P2B-D, Windows 2000
P3 866, 512 MB RDRAM, Radeon X1650, Dell Dimension XPS B866, Windows 7
M2 @ 250 MHz, 64 MB SDR, SiS5598, Compaq Presario 2286, Windows 98

Reply 5 of 27, by obobskivich

Rank: l33t
frisky dingo wrote:

I have a newer system (see sig) and the 290s lack VGA and can't use a passive DVI to VGA adapter. 😵 The system has a GT 220 which has VGA, and my CRT looks great with it. But the GT 220 sucks for gaming.
Does anyone have any ideas for a way I can use a CRT on my 290s? The R9 290s have HDMI, DisplayPort and 2 DVI-D ports. The best thing I could think of would be to get a converter of some sort, but I'm worried about the video quality from those. 😖 Has anyone here used a DVI-D, DisplayPort or HDMI to VGA converter?

DP to VGA adapter - make sure it has sufficient resolution support for your monitor. I would avoid HDMI, as those converters are usually expensive and primarily target interoperability for things like conference rooms or AV systems (it's basically a legacy support path for analog devices like DVD players and whatnot).

jwt27 wrote:

I'd say boycott AMD for removing analog video and get a Geforce instead!

Have you actually tried VGA output on any recent nVidia card? (Kepler or newer) The quality is atrocious, and the VGA-mode performance is equally awful. I have cards from the mid-90s that produce a sharper picture and score (significantly) better in VGA benchmarks like 3D Bench.

For converters, see if you can find one that supports hdmi 2.0 or displayport 1.4. These standards support a 600MHz pixel clock, so I would assume a VGA DAC with these specs to be sharper. An R9-290x only supports hdmi 1.4 and dp 1.3 though, which go up to 340MHz (not even enough for 2048x1536 at 75Hz).

There is no "DisplayPort 1.4" - the R9 290 series supports DP1.2, which differs from the newest revision (1.3, only ratified last September and supported by nothing yet, not even the brand-new Titan X) only in that it lacks support for 8K displays and stereo 3D 4K, neither of which is generally available to the public. The 290X does support 4K (or >120Hz 1080p) over DP and HDMI, as well as a pair of dual-link DVI outputs; 2048x1536 is not a problem (digitally, at least) on any output. However, you are unlikely to find a stand-alone VGA adapter with that capability; most of them top out at around 1080p.
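To put some rough numbers on the digital side of that (a sketch with nominal payload figures; it ignores audio, overhead beyond basic line coding, and product-specific TMDS clock limits, and the example pixel clocks are ballpark CVT-RB/CEA-style values, not exact timings):

    # Approximate video payload bandwidth of the links discussed above, versus the
    # data rate a few example modes need at 24 bits per pixel.

    links_gbps = {
        "Dual-link DVI (2 x 165 MHz TMDS)": 2 * 165e6 * 3 * 8 / 1e9,   # 3 channels, 8 data bits per 10-bit symbol
        "HDMI 1.4 (340 MHz TMDS)":          340e6 * 3 * 8 / 1e9,
        "DisplayPort 1.2 (HBR2, 4 lanes)":  4 * 5.4e9 * 8 / 10 / 1e9,  # 8b/10b line coding
    }

    modes_gbps = {
        "2048x1536@75 (about 265 MHz)": 265e6 * 24 / 1e9,
        "3840x2160@30 (about 297 MHz)": 297e6 * 24 / 1e9,
        "3840x2160@60 (about 594 MHz)": 594e6 * 24 / 1e9,
    }

    for link, cap in links_gbps.items():
        print("%s: %.1f Gbit/s payload" % (link, cap))
        for mode, need in modes_gbps.items():
            verdict = "fits" if need <= cap else "does not fit"
            print("  %s needs %.1f Gbit/s -> %s" % (mode, need, verdict))

Which is the point: 2048x1536 is comfortable on any of the digital outputs; the hard part is finding a converter that will take it back to analog.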

jwt27 wrote:

Fun fact, VGA is analog, there is no resolution limit.

Theoretically true, but not in practice.

Reply 6 of 27, by jwt27

Rank: Oldbie
obobskivich wrote:

Have you actually tried VGA output on any recent nVidia card? (Kepler or newer) The quality is atrocious, and the VGA-mode performance is equally awful. I have cards from the mid-90s that produce a sharper picture and score (significantly) better in VGA benchmarks like 3D Bench.

I'm looking at the VGA output from a GTX780 right now. Seems okay to me. It's not quite as sharp as I'd like, since it judders horizontally ever so slightly (still not sure if that's the card or the monitor). Compared to a Geforce 7800, it's an improvement though.
A friend of mine has a GTX970 and it does some nasty scaling in VESA and text modes.

obobskivich wrote:

There also is no "DisplayPort 1.4" - the R9 290 series supports DP1.2

Oops, yeah, I meant to say DP 1.2 there.

obobskivich wrote:
jwt27 wrote:

Fun fact, VGA is analog, there is no resolution limit.

Theoretically true, but not in practice.

Tell me about it 🙁

Reply 7 of 27, by obobskivich

Rank: l33t
jwt27 wrote:

I'm looking at the VGA output from a GTX780 right now. Seems okay to me. It's not quite as sharp as I'd like, since it judders horizontally ever so slightly (still not sure if that's the card or the monitor). Compared to a Geforce 7800, it's an improvement though.
A friend of mine has a GTX970 and it does some nasty scaling in VESA and text modes.

I have both a Fermi and Kepler card, and both of their VGA outputs are awful compared to my 3DLabs, GeForce FX/6/7/8 (if I'm being completely honest, GeForce 2 Ultra also), and Radeon 7/9/X/X1k cards - blurred, over-saturated, text-mode is blocky/chunky, and higher resolutions can have noise and other undesirable features. It's just a lost cause imho. To say nothing of how awful they perform in VGA-mode benchmarks, and how bad GDI acceleration looks. They both do odd things with scaling as well (afaik nVidia is just recycling the analog SIP block in NVIO over and over again to have some "hold over AMD's head" feature, without any real concern for quality or compatibility).

Honestly I don't see anything wrong with VGA being dead as a doornail these days - it's not like 2048x1536 resolution 20-22" CRTs were ever paragons of sharpness (I've owned, used, etc quite a few - generally it's better to run them at 1280x960 or 1600x1200 for desktop use, but video is usually okay at higher resolutions); high resolution LCDs by contrast can be absolutely fantastic at full-resolution in terms of clarity (owned and used quite a few of those too).

Oops, yeah, I meant to say DP 1.2 there.

It still doesn't really matter - DP1.2 supports resolutions much higher than DL-DVI, and either of them would have no problems at all with 2048x1536.

Tell me about it 🙁

Speaking theoretically, Dual-link DVI also has no maximum resolution limits. But again, in practice, the situation is very different.

Reply 8 of 27, by jwt27

Rank: Oldbie
obobskivich wrote:

I have both a Fermi and Kepler card, and both of their VGA outputs are awful compared to my 3DLabs, GeForce FX/6/7/8 (if I'm being completely honest, GeForce 2 Ultra also), and Radeon 7/9/X/X1k cards - blurred, over-saturated, text-mode is blocky/chunky, and higher resolutions can have noise and other undesirable features. It's just a lost cause imho. To say nothing of how awful they perform in VGA-mode benchmarks, and how bad GDI acceleration looks. They both do odd things with scaling as well (afaik nVidia is just recycling the analog SIP block in NVIO over and over again to have some "hold over AMD's head" feature, without any real concern for quality or compatibility).

I must say I've never really taken a critical look at text mode and DOS VGA modes with this card, or run any benchmarks. That's not what I have it for, anyway 😀
The BIOS startup screen looks sharp to me, there's no scaling going on, but I do think it's not actually emitting 720x400 (or whatever it's supposed to be) but more like 1024x768 with black borders.
My monitor has adjustable VGA signal filters to artificially enhance sharpness. One thing I noticed when I upgraded from a 7800 to a 780 was that I could turn filter 2 down from about 65% to 25% without any loss in sharpness (and that's a good thing, since the filter introduces some ringing artifacts).
That said I've only used this screen with the 7800 and 780 so far, so I don't really have anything else to compare with.

obobskivich wrote:

Honestly I don't see anything wrong with VGA being dead as a doornail these days - it's not like 2048x1536 resolution 20-22" CRTs were ever paragons of sharpness (I've owned, used, etc quite a few - generally it's better to run them at 1280x960 or 1600x1200 for desktop use, but video is usually okay at higher resolutions)

Ever tried a 0.21mm Eizo though...? 😀
I'm quite the opposite about resolution use: I don't mind lower resolutions for moving images (games/video), but for stationary images I prefer higher resolutions.

Reply 9 of 27, by Standard Def Steve

Rank: Oldbie
jwt27 wrote:

The BIOS startup screen looks sharp to me, there's no scaling going on, but I do think it's not actually emitting 720x400 (or whatever it's supposed to be) but more like 1024x768 with black borders.

I've noticed similar behaviour from my GTX 970. It scales everything up to the display's native resolution. Even the POST and UEFI screens get upscaled to 2560x1440. Whenever I set Windows or an old game to run at 1024x768, it gets upscaled to 1440p and then pillarboxed to correct the aspect ratio.
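The pillarboxing arithmetic, for what it's worth (a quick sketch of aspect-correct scaling the way the card appears to handle it; the function name is just illustrative):

    # Aspect-correct upscale with pillarboxing: scale the source to fill the panel's
    # height, then centre it with black bars on the sides (sketch).

    def pillarbox(src_w, src_h, panel_w, panel_h):
        scale = min(panel_w / src_w, panel_h / src_h)   # largest scale that still fits the panel
        out_w, out_h = round(src_w * scale), round(src_h * scale)
        bar = (panel_w - out_w) // 2                    # black bar width on each side
        return out_w, out_h, bar

    print(pillarbox(1024, 768, 2560, 1440))   # -> (1920, 1440, 320): a 1920x1440 image with 320 px bars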

None of my other video cards perform that additional image processing... at least, not automatically, and certainly not outside of Windows.

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 11 of 27, by frisky dingo

Rank: Member

Thanks for the info, everyone. I think I know what I'm going to do: either a DP to VGA converter or a new 21:9 monitor. I just have to make up my mind. A 21:9 monitor would be amazing with my newer system but suck with my old system. But I'm betting my new system would suck with a converter too 😵
As for Nvidia cards and VGA, they suck; they have since the 2xx days and they're only getting worse. I have a friend with a 780 and its VGA output is scary bad.

EDIT:
As for replacing my R9s with an Nvidia card, I have only this to say:
3.5gb 🤣

Reply 13 of 27, by SquallStrife

Rank: l33t
jwt27 wrote:
obobskivich wrote:
jwt27 wrote:

Fun fact, VGA is analog, there is no resolution limit.

Theoretically true, but not in practice.

Tell me about it 🙁

It isn't even really true in theory. The puny mini-coax leads inside VGA cables, the HD15 connector, the video card's DAC, they all have finite bandwidth.
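As a rule of thumb (and only that, a rough sketch, since cables, connectors and DACs all roll off differently): the finest detail in a VGA signal, alternating light and dark pixels, is one full cycle per two pixels, so the whole analog chain needs at least half the pixel clock in clean bandwidth, and ideally a fair bit more for crisp edges.

    # Rough relationship between pixel clock and the analog bandwidth the VGA chain
    # needs (sketch: alternating pixels = one cycle per two pixels, so >= clock / 2).

    def min_analog_bandwidth_mhz(pixel_clock_mhz):
        return pixel_clock_mhz / 2.0

    example_modes = {
        "1280x1024@85 (157.5 MHz)": 157.5,
        "1600x1200@85 (229.5 MHz)": 229.5,
        "2048x1536@75 (~340 MHz)":  340.0,
    }

    for name, clk in example_modes.items():
        print("%s -> at least %.0f MHz of clean analog bandwidth" % (name, min_analog_bandwidth_mhz(clk)))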

VogonsDrivers.com | Link | News Thread

Reply 14 of 27, by obobskivich

Rank: l33t
SquallStrife wrote:

It isn't even really true in theory. The puny mini-coax leads inside VGA cables, the HD15 connector, the video card's DAC, they all have finite bandwidth.

Yes - hence, "in practice." 😊

Reply 15 of 27, by ODwilly

Rank: l33t

I use a VGA CRT with my HD 7850 using a DVI to VGA adapter. Odd that the R series would be incapable of that - is it just a difference in the type of DVI port on the card?

Main pc: Asus ROG 17. R9 5900HX, RTX 3070m, 16gb ddr4 3200, 1tb NVME.
Retro PC: Soyo P4S Dragon, 3gb ddr 266, 120gb Maxtor, Geforce Fx 5950 Ultra, SB Live! 5.1

Reply 16 of 27, by obobskivich

Rank: l33t
ODwilly wrote:

I use a VGA CRT with my HD 7850 using a DVI to VGA adapter. Odd that the R series would be incapable of that - is it just a difference in the type of DVI port on the card?

Not all of the Rx 200 cards are without analog - the 280 series (which is based on the 7900 series) supports VGA output, for example. In newer GPUs like the 290 series, AMD has removed the analog output capability from the GPU (on the 7850 you're not actually converting DVI (digital) to VGA (analog); you're just passing the VGA signal through a DVI-I connector). The DVI ports on the 290s are keyed DVI-D only to reflect this.

Reply 17 of 27, by ODwilly

Rank: l33t

Ah, OK, that makes sense. If his card is still under warranty he could return it and get a high-end HD 7000-series card then, since they are still DX12 compliant.

Main pc: Asus ROG 17. R9 5900HX, RTX 3070m, 16gb ddr4 3200, 1tb NVME.
Retro PC: Soyo P4S Dragon, 3gb ddr 266, 120gb Maxtor, Geforce Fx 5950 Ultra, SB Live! 5.1

Reply 18 of 27, by smeezekitty

Rank: Oldbie

Honestly I don't see anything wrong with VGA being dead as a doornail these days - it's not like 2048x1536 resolution 20-22" CRTs were ever paragons of sharpness (I've owned, used, etc quite a few - generally it's better to run them at 1280x960 or 1600x1200 for desktop use, but video is usually okay at higher resolutions); high resolution LCDs by contrast can be absolutely fantastic at full-resolution in terms of clarity (owned and used quite a few of those too).

I agree. A good LCD over DVI looks vastly superior to VGA or any CRT. The only downsides an LCD has are backlight bleed and viewing angle.

I would personally stay away from 21:9. 16:9 is already almost too short, and 21:9 is beyond my understanding. I personally like 16:10.

Reply 19 of 27, by calvin

Rank: Member

The real practicality of 21:9 is for tiling window managers, where you can basically have several full-width pages edge to edge.

2xP2 450, 512 MB SDR, GeForce DDR, Asus P2B-D, Windows 2000
P3 866, 512 MB RDRAM, Radeon X1650, Dell Dimension XPS B866, Windows 7
M2 @ 250 MHz, 64 MB SDR, SiS5598, Compaq Presario 2286, Windows 98