VOGONS


First post, by Confused UngaBunga

User metadata
Rank Newbie

In the first decade of the 3rd millennium, or somewhere around there, were any of the brands known for having clearer image output over DVI or VGA?
Thank you.

Reply 1 of 5, by smtkr

User metadata
Rank Member

DVI is digital, so no on that front (although be careful with some of the earlier cards, as many of them don't support higher resolutions).

With respect to "VGA," yes, absolutely. I remember an AnandTech article from the GeForce 3 era complaining very loudly about the image quality of various brands. I'd dig through some of their old stuff and see if you can find it. From what I remember, this was almost exclusively an issue with nVidia card partners; ATI, 3dfx, etc. all had really good analogue output from what I remember.

Reply 2 of 5, by Horun

User metadata
Rank l33t++

It also depends on the monitor. My old Samsung 2494 has VGA and DVI inputs, and the DVI is crisper than the VGA from, say, an nVidia 8800 GT. Though with the VGA input you get more monitor adjustments than with the DVI input...
So it's not just the vid card; the monitor must also be considered...

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 3 of 5, by Confused UngaBunga

User metadata
Rank Newbie

In general, for the same monitor model, did ATi and 3dfx have better VGA image quality because nVidia licensed manufacturing to some partners less inclined to quality control?
There must have been some shady manufacturers of Radeons as well, right?

Reply 4 of 5, by Minutemanqvs

User metadata
Rank Member

Matrox was known for really good analog quality, if you read old articles. And as others said, if you try using an early DVI card on a modern high-resolution display it often doesn't work... it may display the BIOS but fail as soon as a driver is loaded, or things like that.
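[Editor's note: a rough sketch of why early single-link DVI cards choke on modern panels. Single-link TMDS tops out at a 165 MHz pixel clock, so any mode whose clock exceeds that needs dual-link. The 15% blanking overhead below is an approximation, not an exact CVT timing.]

```python
# Estimate whether a display mode fits single-link DVI's 165 MHz pixel clock.
SINGLE_LINK_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.15):
    """Rough pixel clock: active pixels * refresh rate, padded ~15% for
    horizontal/vertical blanking (approximate, not an exact CVT timing)."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for mode in [(1280, 1024, 60), (1920, 1200, 60), (2560, 1440, 60)]:
    clk = pixel_clock_mhz(*mode)
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz -> ~{clk:.0f} MHz, "
          f"single-link OK: {clk <= SINGLE_LINK_MHZ}")
```

By this estimate 1920x1200@60 just squeezes under the limit (with reduced blanking, as real panels use), while 2560x1440@60 is far over it, which is roughly why a 1440p monitor on an early DVI card tends to fail.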

Searching a Nexgen Nx586 with FPU, PM me if you have one. I have some Athlon MP systems and cookies.

Reply 5 of 5, by Horun

User metadata
Rank l33t++
Confused UngaBunga wrote on 2023-05-14, 18:41:

In general, for the same monitor model, did ATi and 3dfx have better VGA image quality because nVidia licensed manufacturing to some partners less inclined to quality control?
There must have been some shady manufacturers of Radeons as well, right?

You could be correct. Though think about the signaling: the PC is digital, and the video card converts to analog (for the DB15 VGA connector), usually via the RAMDAC; use a cheap RAMDAC and cheap filtering and you get poor VGA output.
DVI-I carries both analog and digital (DVI-D is pure digital), which should be cleaner for monitors with DVI input because there is less conversion.
Also, in the monitor: a CRT can keep the signal analog and drive the tube through its circuits, which, if those are cheap, can make a good vid card look poor...
On the other hand, for an LCD/LED display the DB15 input must be converted back to digital to drive the panel. It all gets a bit complicated 😀
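[Editor's note: a toy model of the "cheap filtering = poor VGA" point above. The analog output filter after the RAMDAC is approximated here as a first-order low-pass; a low-bandwidth (cheap) filter smears a crisp one-pixel line into its neighbors, which on screen reads as softness/ghosting.]

```python
# Model a one-pixel-wide white line passing through a simple low-pass filter,
# a stand-in for the analog filtering after a VGA card's RAMDAC.

def low_pass(samples, alpha):
    """First-order IIR low-pass. alpha near 1.0 = wide bandwidth (sharp edges),
    alpha near 0.0 = narrow bandwidth (edges smear into neighboring pixels)."""
    out, prev = [], 0.0
    for s in samples:
        prev = prev + alpha * (s - prev)
        out.append(prev)
    return out

row = [0, 0, 0, 255, 0, 0, 0]   # ideal digital signal: crisp 1-px white line
good = low_pass(row, 0.9)       # high-bandwidth filter: peak stays near 255
cheap = low_pass(row, 0.3)      # cheap filter: peak drops, energy bleeds sideways
print([round(v) for v in good])
print([round(v) for v in cheap])
```

With the cheap filter the line's peak is much lower and the pixels after it stay brighter, i.e. the sharp edge has been blurred, which is the effect reviewers complained about on some VGA outputs.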

Agree!! Matrox was a master at making superb VGA output from their cards, which is why back in the day they were sought after for CAD/graphics use on PCs.
And probably all the card manufacturers cut corners here and there on certain releases or designs, or the OEMs did... just thinking about all the possibilities where a vid card and monitor could be improved.
Good subject matter for the grey matter ;p edit fixed a boo-boo 🤣
