VOGONS


What would be the best graphics card in 2005?


First post, by 89ermis

Rank: Newbie

Hello everyone, I want to build a computer that would have been the best available in 2005.
I already have a CRT TV with S-video, so that's the only prerequisite.
I mainly want to be able to play early to mid 2000s titles such as
Tomb Raider series
Civilization III
NFS: Porsche Unleashed
etc
I've seen some people online recommend the 7800 GTX or the X1900 XT. Are these good options, and what CPU would you recommend to pair with them?
Again, my main requirement is to be able to play these old games hassle-free. This is my first time considering building a retro gaming rig, so if you have any other advice it is more than welcome.

Reply 1 of 29, by MMaximus

Rank: Oldbie

In my experience S-video has always been subpar - if you want optimal quality for the era you'd need a VGA connection. S-video was mostly a way for graphics card manufacturers to let you use a CRT TV as a secondary screen, for example to watch movies - but I don't think it ever became widespread.

Hard Disk Sounds

Reply 2 of 29, by 89ermis

Rank: Newbie

Well, the reasoning behind this is that I already have a CRT TV with this type of connection (along with RGB SCART), so I wouldn't have to search for a PC CRT monitor, which is kind of troublesome to find here in Greece.

Reply 3 of 29, by auron

Rank: Oldbie

well, 7800 gtx or x1800 xt would be the best from 2005. but you'd also have to find a VIVO cable, which may not be that easy to come by now.

if you are really set on running this through s-video, where you could render at 640x480, you could totally get by with a 6600 gt, or even a 9600 pro or something. certainly with the types of games you mentioned. also, look into soft15khz and what cards are compatible with that, because with the right cable you could use the RGB input on your TV. the other option is to transcode to component, which is probably way out there in terms of cost though.

Reply 4 of 29, by swaaye

Rank: l33t++

ATI was usually better on the composite/S-Video output quality. They used their own TV encoder chip line called Theater.

Their X850 from 2005 is nice, but it is a Shader Model 2 card, so you run into unsupported games by 2006. Still, ATI had better quality filtering and anti-aliasing until Nvidia released the 8800.

Reply 5 of 29, by elszgensa

Rank: Member

Outputting S-Video to a consumer TV tops out at roughly [the analog equivalent of] a low low 640x480 for NTSC or slightly above that for PAL, so if that's what you intend to go with then suddenly the choice of video card doesn't matter all that much since that's about potato territory.

Don't skimp on the video cable - you'll probably gain more quality from a slightly expensive but decent one than from spending the difference on a better card paired with shit chinesium wiring.

Last edited by elszgensa on 2024-12-17, 23:50. Edited 1 time in total.

Reply 6 of 29, by DrAnthony

Rank: Member

I think the wrinkle here is that the X1800 suffered significant delays and didn't launch until October. The X1900 launched a touch over two months later, right around New Year's Day I think. Technically the X1900 isn't a 2005 card, but I have a hard time separating the release schedules of the two. That, and the X1800 might be a lot harder to find, since it barely even existed. The GeForce 7900 GTX launched in, what, March the following year, and represents something much more widely available during that period. I think any of the enthusiast-class cards from either brand would fit the bill here; personally I'd break the rules juuuuuuuust a smidge and run an X1900 XT(X). The XTX was basically within the margin of error performance-wise against the XT, so I wouldn't bend over backwards for one.

Reply 7 of 29, by auron

Rank: Oldbie
elszgensa wrote on 2024-12-17, 23:45:

Outputting S-Video to a consumer TV tops out at roughly [the analog equivalent of] a low low 640x480 for NTSC or slightly above that for PAL, so if that's what you intend to go with then suddenly the choice of video card doesn't matter all that much since that's about potato territory.

in theory, rendering at a higher resolution should give downsampling on the TV out. i've never used this, but clone mode is mentioned in connection with it, so at least some of those cards should work like that. but how appreciable that would be depends on the TV size, quality and the type of connection used.

Reply 8 of 29, by leileilol

Rank: l33t++

x850's fine if you don't mind not playing some of the Ubisoft titles that required SM3.0 asap. (no big loss, the card can still work with Crysis and UT3)

long live PCem

Reply 9 of 29, by rasz_pl

Rank: l33t
auron wrote on 2024-12-18, 01:53:

in theory, rendering at a higher resolution should give downsampling on the TV out.

On LCD yes, but CRT filters things on its own even from perfect full BW NTSC unless you have one of those high TVL count TV production monitors.

https://github.com/raszpl/FIC-486-GAC-2-Cache-Module for AT&T Globalyst
https://github.com/raszpl/386RC-16 memory board
https://github.com/raszpl/440BX Reference Design adapted to Kicad
https://github.com/raszpl/Zenith_ZBIOS MFM-300 Monitor

Reply 10 of 29, by BitWrangler

Rank: l33t++

When I last had a CRT TV next to the PC, in 2003-ish, doing TV out from my GF3 on an Athlon XP system, I preferred 800x600 in game. Any res was hell for text and desktop though, so it's best to have even the worst 14" LCD for admin and management stuff rather than relying solely on the TV.

A major problem with finding good TV out capabilities in a card is that it varied by specific card model per specific manufacturer. So you can't just search the GPU and get a list of ones that will work... you've gotta eyeball them for the socket and ask the seller if they have the possibly unique dongle for it.

There are pretty much no good reviews around of video out functionality for any "normal" cards, though some specially targeted home theater cards with all the bells and whistles did get a review or two that covered it. The reason for this appears to be that cards were always reviewed as preview models, with preview drivers which were "just the basics" and did not include the video out components/utilities these cards needed; those were added to the retail drivers. It also appears that the right drivers may only be in packages from within a year or two of the specific card's release, and you may need manufacturer-specific drivers. So you need 70 series Detonators for your video out to work, but oh no, maybe they were broken for a later game, and an 80 or later release fixed it, but guess what, no TV out drivers... it's a mess.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 11 of 29, by 89ermis

Rank: Newbie

You guys think I should try and find a PC CRT for better results with VGA?

Reply 12 of 29, by Joseph_Joestar

Rank: l33t++
89ermis wrote on 2024-12-18, 07:40:

You guys think I should try and find a PC CRT for better results with VGA?

Consumer CRT TVs from that time just weren't sharp enough for PC use. They were OK for watching DVDs via TV-out, but lacked the clarity needed for actual gaming.

If you have a hard time finding a PC CRT monitor, consider a modern 1920x1200 business oriented LCD, like the Asus ProArt PA248QV. It can display 1600x1200 natively (with black bars on the side) which was the premier gaming resolution during the mid 2000s.
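The arithmetic on those black bars is simple enough to sketch (the panel and source resolutions below are just the ones mentioned above):

```python
def pillarbox_bar_width(panel_w, panel_h, src_w, src_h):
    """Width of each black side bar when a source image is shown 1:1,
    centered on a wider panel of the same height."""
    if src_h != panel_h or src_w > panel_w:
        raise ValueError("source must match panel height and fit its width")
    return (panel_w - src_w) // 2

# 1600x1200 centered on a 1920x1200 panel leaves a 160 px bar per side.
print(pillarbox_bar_width(1920, 1200, 1600, 1200))  # 160
```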

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 14 of 29, by rasz_pl

Rank: l33t
Joseph_Joestar wrote on 2024-12-18, 08:16:

Consumer CRT TVs from that time

Console gamers say whaaaa? Nothing wrong with playing in Standard Def as long as you are hitting that sweet locked Vsync.
What do you mean by 'that time'? OP didn't mention his TV model.


Reply 15 of 29, by Joseph_Joestar

Rank: l33t++
rasz_pl wrote on 2024-12-18, 10:34:

Console gamers say whaaaa? Nothing wrong with playing in Standard Def as long as you are hitting that sweet locked Vsync.
What do you mean by 'that time'? OP didn't mention his TV model.

I have a 29" Sony Trinitron WEGA from 2004 or so. My console games (SNES, Sega Mega Drive, PlayStation 1 & 2) look amazing on it when using an RGB SCART connection.

But PC games via TV out (S-Video) are too blurry on it for my taste. Comparing it to my 17" Samsung Syncmaster 795MB, the difference is night and day.


Reply 16 of 29, by auron

Rank: Oldbie

maybe some of these cards use heavy filtering on the TV out when downsampling from non-integer multiples, which would be the usual case in practice - say, 1024x768 to 720x480, or whatever the resolution of the TV out is. running at the exact TV out resolution or an integer multiple of it should give better results in theory. another issue though is overscan - the card needs to compensate for it by putting a black border around the actual viewing area.
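to illustrate the integer-multiple point (720x480 as the TV out resolution is just an assumption for the example - actual encoder output varies):

```python
def per_axis_factors(render, tv_out):
    """Per-axis downscale factors from render resolution to TV out resolution."""
    (rw, rh), (tw, th) = render, tv_out
    return rw / tw, rh / th

def is_integer_multiple(render, tv_out):
    """True when both axes divide evenly, so simple decimation would suffice."""
    (rw, rh), (tw, th) = render, tv_out
    return rw % tw == 0 and rh % th == 0

# 1024x768 -> 720x480: awkward ~1.42x / 1.6x factors, hence heavy filtering.
# 1440x960 -> 720x480: a clean 2x on both axes.
print(is_integer_multiple((1024, 768), (720, 480)))  # False
print(is_integer_multiple((1440, 960), (720, 480)))  # True
```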

Reply 17 of 29, by elszgensa

Rank: Member
89ermis wrote on 2024-12-18, 07:40:

You guys think I should try and find a PC CRT for better results with VGA?

That'd be ideal, but failing that, you can likely still find older (not necessarily "2005 old", more like from ~ten years ago) LCD monitors (or HD(-Ready) TVs) with VGA in, or convert the signal to something a modern screen will accept. S-Video to an SD CRT TV as your sole PC display will be a pretty miserable experience.

auron wrote on 2024-12-18, 11:15:

overscan, it needs to compensate for that by putting a black border around the actual viewing area.

Not necessarily - you can also take care not to put any relevant info (GUI elements etc.) in there. Many console games do it this way (and I'd argue it's the more reasonable thing to do - you wouldn't want unnecessary black borders on TVs with a slimmer overscan area, and those varied a lot). PC games, however, never paid attention to that - they never had to.
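As a sketch of what that looks like in practice, here's roughly how a console game might compute a safe drawing area (the 5% per-edge margin is a common rule of thumb, not a figure from this thread, and real sets varied):

```python
def safe_area(width, height, margin_frac=0.05):
    """(x, y, w, h) of the region assumed to survive overscan,
    trimming margin_frac of the frame from each edge."""
    mx = round(width * margin_frac)
    my = round(height * margin_frac)
    return mx, my, width - 2 * mx, height - 2 * my

# On a 720x480 frame: keep HUD and menus inside (36, 24, 648, 432).
print(safe_area(720, 480))
```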

Reply 18 of 29, by auron

Rank: Oldbie

well, this was implied in my post - you're probably not going to rewrite the windows GUI, every game etc. to take overscan into account, so the black border it is.