VOGONS


First post, by BeginnerGuy


Hiya, I need to upgrade the video card in my Coppermine 800 MHz rig, as I need the monitor duplication feature to pass the image over to my capture card.

I have a Geforce 4 Ti (NV25) and I find the performance pleasant enough. Morrowind would be a fun addition to the rig but I think the CPU isn't up to snuff to run that nicely.

Anyway, I'm curious if I should bother shooting for a higher-end 5000 series card or if it will just get bottlenecked by the CPU. I'm always looking for the cheapest possible parts, so I don't want to buy anything that won't bring additional benefit. At minimum it needs to match the current card in performance while being able to duplicate monitors over two DVI ports.

Thoughts?

Sup. I like computers. Are you a computer?

Reply 1 of 28, by squiggly


Do not use an FX series card for anything before DX8, as the drivers are too new and do not cope. Oh, and for anything after DX8 either, as the FX series had notoriously bad DX9 performance.

A high-end 59xx for DX8 with AA+AF... yes, although you would definitely want something better than a Coppermine. I hook a 5900XT up to a Pentium 4 3.2 GHz, for example, and it performs very, very nicely for DX8-era games.

Reply 2 of 28, by BeginnerGuy


Thanks, I have no reason to upgrade beyond the Coppermine as the rig is perfectly fine where it is. I use a Core 2 rig for anything DX9 and beyond.

Unfortunately, I do run many pre-DX8 titles with it. So now I'm thinking maybe I should just look for a DVI splitter and call it a day? I don't need extended monitors, simply a duplicated image sent out to the capture card in another PC.

Sup. I like computers. Are you a computer?

Reply 3 of 28, by squiggly

BeginnerGuy wrote:

Thanks, I have no reason to upgrade beyond the Coppermine as the rig is perfectly fine where it is. I use a Core 2 rig for anything DX9 and beyond.

Unfortunately, I do run many pre-DX8 titles with it. So now I'm thinking maybe I should just look for a DVI splitter and call it a day? I don't need extended monitors, simply a duplicated image sent out to the capture card in another PC.

I also use C2D for DX9.

I haven't used a splitter cable for DVI, but I know they degrade VGA signals. I have a powered external splitter box for VGA. I think you can also get capture boxes that plug into another computer via USB 3 but will pass the DVI signal through to another DVI output. It all depends on how much $$$ you want to spend.

Reply 4 of 28, by swaaye


The FX series actually works quite well with many old games. The cards even have support for palettized textures and table-based fog, which are mostly DirectX 3-5 things inspired by Voodoo cards.

Unless you get a 5900 / 5950 Ultra, the speed isn't going to be much different than a GF4 Ti though. They will probably run slower in general because they have more driver CPU overhead and that P3 will feel it.

Reply 5 of 28, by ProfessorProfessorson

squiggly wrote:

Do not use an FX series card for anything before DX8, as the drivers are too new and do not cope.

That's not even a remotely correct statement. The FX line handles DirectX 6 and 7 perfectly fine. The issue you are going to have more than anything is trying to power cards like the FX 5800 and 5900 on older setups with modern power supplies, since you won't have that much power on your +5 and +3.3 V rails to power everything, maybe 120 watts depending on your PSU.
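To put some rough numbers on that, here is a back-of-the-envelope sketch; the amp ratings and combined limits below are made-up but era-typical label values, not measurements from any particular unit:

# Rough minor-rail budget: watts available on +3.3 V / +5 V from label amps.
# All amp figures here are illustrative, not taken from a real PSU label.
def minor_rail_watts(amps_3v3, amps_5v, combined_limit_w=None):
    total = 3.3 * amps_3v3 + 5.0 * amps_5v
    return min(total, combined_limit_w) if combined_limit_w is not None else total

# Era-typical 350 W unit: strong minor rails, weak 12 V rail.
print(minor_rail_watts(28, 30, combined_limit_w=180))   # -> 180 W on +3.3/+5

# Typical modern unit: nearly everything on 12 V, minor rails capped low.
print(minor_rail_watts(20, 20, combined_limit_w=120))   # -> 120 W on +3.3/+5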

Reply 7 of 28, by The Serpent Rider


ProfessorProfessorson wrote:

The issue you are going to have more than anything is trying to power cards like the FX 5800 and 5900 on older setups with modern power supplies

I thought the main source of power for the FX 5900 series was the 12 V rail.

Last edited by The Serpent Rider on 2018-05-14, 03:02. Edited 1 time in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 8 of 28, by BeginnerGuy


I'm still curious which card would pair best with a Coppermine, though. I expect it would significantly bottleneck the 5900 as it is. I haven't found much in the way of benchmarks.

I have to check the rails on the power supply; it's about 12 years old, but most likely heavy on the 12 V rail, as is standard today.

Sup. I like computers. Are you a computer?

Reply 9 of 28, by Ozzuneoj

BeginnerGuy wrote:

I'm still curious which card would pair best with a Coppermine, though. I expect it would significantly bottleneck the 5900 as it is. I haven't found much in the way of benchmarks.

I have to check the rails on the power supply; it's about 12 years old, but most likely heavy on the 12 V rail, as is standard today.

I'd use an FX 5200, Geforce 4MX or some other cheap card from a couple years after the CPU was made to get high performance without spending a bunch of money. If you want something more period-correct, you'll generally want a Voodoo 3, TNT2 (maybe not M64) or at most a Geforce 2 of some kind. Later cards will be significantly faster though and driver support shouldn't be much of a problem as long as you use older drivers.

Now for some blitting from the back buffer.

Reply 10 of 28, by ProfessorProfessorson

The Serpent Rider wrote:

The issue you are going to have more than anything is trying to power cards like the FX 5800 and 5900 on older setups with modern power supplies

I thought the main source of power for the FX 5900 series was the 12 V rail.

Sadly no, and I'd really like to know who started that misconception. I mean, for one, the molex carries both the 12 V and 5 V rails. Two, I have yet to see anyone reference Nvidia specifically stating the AGP FX and 6xxx cards are powered off the 12 V rail only, because it never happened. Same as all the others, they pull most of their power from the AGP slot, same as any other AGP card that also pulls power from a molex. The molex just fills in the gap that the AGP slot can't fill when the card is at load, because the AGP port caps out at about 49 watts unless you mod it. The FX came from a time when Antec 350 watt power supplies were considered hefty as is, and something like the TrueControl 550 watt was considered excessive and mainly for power users overclocking Nforce 2 type setups. Back then the 12 V rails on 300-350 watt power supplies were still fairly weak, usually 12-16 amps, the majority of the amperage was on +5 and +3.3, and the FX 5900 was capable of pulling 60 watts at "moderate" load if you didn't overclock.

The 12 V rail was usually still intended mainly to power mechanical parts at that point, as in fans, hard drives, and CD/DVD-ROM drives, plus part of the motherboard if you were running Nforce 2 or Pentium 4 machines. If the FX was powered solely, or even mostly, off the 12 V rail, that would not have left much wiggle room to run hard drives, fans, DVD-ROM drives, and Nforce 2 motherboards with Athlon XP 2500+ through 3200+ processors. From my own experience, I can tell you this: I have tried my prior Gainward FX 5900, a couple of AGP 6800s with the pipes unlocked, and a Radeon 9800 XT in Athlon Thunderbird 1.2 GHz and Athlon XP VIA-type builds on modern supplies, and for my efforts I got whine coming from the power supplies, or system shutdowns. And I can't use my current stock AGP GF 6800 in my current P3 system, because once it hits load on a few loops of 3DMark 2003 or Aquamark I get a system shutdown. I can pull off using a Radeon 9700, though I have a gut feeling that I am barely getting by every time I stick it in there.
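Rough numbers for the budget I'm describing, with the figures quoted above (60 W load, ~49 W slot ceiling, a 12-16 A 12 V rail) plugged in as illustrative values, not measurements:

# Illustrative split of where an FX 5900's power comes from at load, using the
# ballpark figures quoted above (not measured values).
CARD_LOAD_W    = 60.0   # FX 5900 at "moderate" load
AGP_SLOT_MAX_W = 49.0   # roughly what the AGP slot can deliver unmodded
RAIL_12V_AMPS  = 14.0   # typical 12 V rating on a 300-350 W PSU of the era

molex_topup_w = max(0.0, CARD_LOAD_W - AGP_SLOT_MAX_W)   # gap the molex fills
rail_12v_w    = 12.0 * RAIL_12V_AMPS                      # total 12 V capacity

print(f"slot: up to {AGP_SLOT_MAX_W:.0f} W, molex top-up: ~{molex_topup_w:.0f} W")
print(f"if all {CARD_LOAD_W:.0f} W came off 12 V, that's "
      f"{CARD_LOAD_W / rail_12v_w:.0%} of a {rail_12v_w:.0f} W rail "
      "before drives, fans and the board get anything")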

Last edited by ProfessorProfessorson on 2018-05-14, 04:42. Edited 1 time in total.

Reply 11 of 28, by squiggly

Ozzuneoj wrote:
BeginnerGuy wrote:

I'm still curious which card would pair best with a Coppermine, though. I expect it would significantly bottleneck the 5900 as it is. I haven't found much in the way of benchmarks.

I have to check the rails on the power supply; it's about 12 years old, but most likely heavy on the 12 V rail, as is standard today.

I'd use an FX 5200, Geforce 4MX or some other cheap card from a couple years after the CPU was made to get high performance without spending a bunch of money. If you want something more period-correct, you'll generally want a Voodoo 3, TNT2 (maybe not M64) or at most a Geforce 2 of some kind. Later cards will be significantly faster though and driver support shouldn't be much of a problem as long as you use older drivers.

Driver support is a huge problem, as newer cards are not supported by older drivers. It's the main reason you want to avoid the FX series in particular for DX6/7 games; even a plain old GF2 will smash them using the older drivers. Even the GF4 is limited by being unable to use older drivers. The absolute premium cards would be a V5-5500 or a Geforce3, but as they are expensive/rare, a Voodoo3 or Geforce2 GTS makes an affordable and excellent alternative.

Reply 12 of 28, by ProfessorProfessorson

squiggly wrote:

Driver support is a huge problem, as newer cards are not supported by older drivers. It's the main reason you want to avoid the FX series in particular for DX6/7 games; even a plain old GF2 will smash them using the older drivers. Even the GF4 is limited by being unable to use older drivers. The absolute premium cards would be a V5-5500 or a Geforce3, but as they are expensive/rare, a Voodoo3 or Geforce2 GTS makes an affordable and excellent alternative.

It would be really cool if you stopped spreading false info like that. FX cards do not have problems running DX6 and DX7 titles. You simply need to be running the proper drivers. I have a massive library of games and I have yet to have any issues with any of them on my FX cards, unless the game was specifically optimized for Glide and DX support was tacked on, like in Dethkarz. I also have yet to find a game where my GF 2 Ultra or GF 3 Ti 200 is faster than my FX 5900. Yes, you may have to driver-bounce for a random game here or there, because, to be blunt, they broke stuff often back then during the Forceware releases; they were more worried about games just being released than about games that had been out for two to three years. But you'd have to do the same on a GF 2 or GF 3 until you find your sweet-spot driver, like 45.23 or something.

The key is to not use the last few driver releases made for your card, unless there were hardly any made to begin with, like with the Fury MAXX or Kyro II. What you do is look at what card you plan to run and decide what year, game-wise, you plan to have as a cutoff; let's say no games made and released past 2002. Then pick the driver set that works best and has a publishing date that falls within a year of that game release time frame. You take that driver and sit on it as your main driver. Then you find one of the early ones that supported your card and works well with any old problem games. Not the very first release that supports the card, but something from the first seven months or so, and keep that as your secondary. Also try to stick with WHQL sets; Nvidia released tons of beta drivers back then like it was going out of style, fixing stuff on the fly, as opposed to just focusing on a solid driver release every two to three months.
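If it helps, here is that rule of thumb written out as a little script. The driver entries, dates, and version numbers below are placeholders for illustration, not an authoritative driver list:

# Sketch of the driver-picking rule of thumb above. The entries are placeholder
# (version, release_date, is_whql) tuples, not an authoritative table.
from datetime import date, timedelta

drivers = [
    ("30.82", date(2002, 10, 1), True),
    ("43.45", date(2003, 3, 1),  False),
    ("45.23", date(2003, 7, 1),  True),
    ("53.03", date(2003, 12, 1), True),
]

def pick_main(drivers, game_cutoff_year):
    """Newest WHQL set published within roughly a year of your newest game."""
    window_end = date(game_cutoff_year + 1, 7, 1)
    ok = [d for d in drivers if d[2] and d[1] <= window_end]
    return max(ok, key=lambda d: d[1]) if ok else None

def pick_secondary(drivers, card_support_start):
    """An early set (but not the very first) from the first ~7 months of support."""
    window = sorted((d for d in drivers
                     if card_support_start <= d[1] <= card_support_start + timedelta(days=210)),
                    key=lambda d: d[1])
    return window[1] if len(window) > 1 else (window[0] if window else None)

print(pick_main(drivers, game_cutoff_year=2002))   # -> ("45.23", ...)
print(pick_secondary(drivers, date(2002, 9, 1)))   # -> ("43.45", ...)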

Also, and don't get me wrong, I love my 3DFX cards, but no one should consider the Voodoo 3, 4, or 5 as the premium card to have unless most of what they plan to run is Glide-related. DirectX support was not as strong as Glide support, and you honestly won't have an awesome time trying to play through RTCW, Medal of Honor: Allied Assault, Max Payne, or Serious Sam: The First Encounter on a Voodoo 5, with the low framerates and visual issues. If you only care about DirectX 6 games and prior, then the Voodoo cards will have you covered easily. Once DX7 hit, support got to be pretty meh, as did the performance in DirectX 7 titles that didn't support Glide.

Also, why are you leaving out Geforce 4 cards? The MX 440 and Ti 4200 are awesome cards that are not too hard on modern power supplies. They run DX6, 7, and 8 games just as well as the GF 2 and GF 3 line did.

Reply 13 of 28, by F2bnp

squiggly wrote:

Driver support is a huge problem, as newer cards are not supported by older drivers. It's the main reason you want to avoid the FX series in particular for DX6/7 games; even a plain old GF2 will smash them using the older drivers. Even the GF4 is limited by being unable to use older drivers. The absolute premium cards would be a V5-5500 or a Geforce3, but as they are expensive/rare, a Voodoo3 or Geforce2 GTS makes an affordable and excellent alternative.

Sounds like someone watched that one Phil video, but didn't really understand it 😉.

Reply 15 of 28, by F2bnp

squiggly wrote:

Sounds more like a few people here overpaid for some FX5900s 😉

https://www.youtube.com/watch?v=tr5EvZDjUFY

Seriously though, why do you keep saying this? Do you have any specific example of games that will not run on FX?

I'd personally avoid anything other than an FX 5900 and maybe an FX 5700, although that and the FX 5950 are later releases and do indeed require newer drivers. The older cards are just shite for the most part. You could make a case that the FX 5600 has the upper hand compared to a Ti 4200 when you enable AF and/or AA, but in most cases it's just too slow to matter, or in the games where you could do it, the Ti 4200 is fast enough.
If you use an FX 5900, though, you get a pretty nice card for older titles and you can still use the 45.23 drivers, which are really the point of no return, as later drivers tend to break compatibility. The most popular title that I think has issues with later driver releases is probably No One Lives Forever.

Reply 16 of 28, by ProfessorProfessorson


If you go back and read his earliest posts from when he joined, you'll see he was a newbie back then, and to most of this he still is even now. His FX comments are not based on anything factual, let alone his own actual experience. He is just regurgitating other people's FX hate that he picked up prior and took as literal fact, I guess.

Reply 18 of 28, by shamino

The Serpent Rider wrote:

The issue you are going to have more than anything is trying to power cards like the FX 5800 and 5900 on older setups with modern power supplies

I thought the main source of power for the FX 5900 series was the 12 V rail.

https://web.archive.org/web/20110920073015/ht … s-nv-power.html

They only look at the Ultra versions, but interestingly the FX5950 Ultra has a more even draw between +5 V and +12 V, while the FX5900 draws much more from the +12 V rail.
I'm having trouble getting the charts to load from that archive.org page, so I'll attach them here separately. Also attaching the charts for an FX5700 (DDR) and an FX5700 Ultra (DDR2). The card models are noted in the upper-right corner of each chart.

Attachments

  • fx5700_DDR_table-b.gif — FX5700 DDR
  • fx5700u_DDR2_table-b.gif — FX5700 Ultra DDR2
  • fx5950u_table-b.gif — FX5950 Ultra
  • fx5900u_table-b.gif — FX5900 Ultra

Reply 19 of 28, by ProfessorProfessorson


That's a sweet article. However, the problem with their testing is that they are doing it on a more modern ATX and AGP specification, with an 8x slot on an Athlon 64 board that relies heavily on 12 V to power everything on the motherboard, stepping it down when needed, so you can't totally trust whether the 3.3 V and 5 V going to certain components is even coming off the PSU's +5/+3.3 rails, or if it's being stepped down from the +12 on the motherboard. This was a time when the AGP standard dropped its heavy reliance on the 3.3 V pins in the slot and swapped over to 1.5 V and then 0.8 V. So the +12 V dependence is heavy there, and the system in that test was built with that in mind, with a power supply built to that ATX requirement.

So while that chart applies nicely to something from that generation, it won't help so much for older Socket A, Socket 370, and slot boards that don't act that way, since it won't be as cut and dried for them, which is why you end up with a lot of modern supplies getting bent out of shape with some FX and 6800 cards paired with old boards. What is needed is someone willing to take a wide selection of older boards, and also a wider selection of FX cards other than just a few, because manufacturers didn't always stick with stock designs. The chart they did is kind of scary, too, with one card, the weaker FX 5900 Ultra, pulling much more heavily on the 12 V rail than the 5950 does.

EDIT because I hate double posting: Also, if you look at the testing they do, they find the Galaxy 6800 pulling more amps off the AGP slot than the 6800 Ultra. This seems to fall in line with my current Apogee 6800 AGP, which hates to work with my EVGA 400 and 500 watt supplies in my older VIA Slot 1 build in my closet, but does fine with an older PowerMan 350 watt in there. I think that board is a Trinity or an Apollo Pro; I forget which, because I have not messed with it in a tad over a year. Unlike my Intel-brand 440BX2 boards, it supports 4x AGP, if anyone cares to know. The only way I found I can get some stability on my EVGA supplies with that card and board is if I pull a hard drive and two of my 256 MB RAM sticks out of it. It's a pain to mess with that board in general, though. I really did not like the VIA support for Socket 370 and Slot 1 based on their chips; driver installation is more hands-on than with Intel and you have to tinker a lot.

Also, though I'm not willing to try it at the moment because I'd have to dig the thing out of my closet, I wonder if pulling my AWE64 out of it would help things. Not really sure how much power it uses, but I imagine it's somewhat more than what the Aureal Vortex and SB Live draw. And per the above, that brings you to another issue: 440BX2 being 2x AGP, you can't even use the stock reference-design GF 6800 on them, because those cards won't support that older AGP slot voltage-wise. And I mean physically you can't even stick one in without modding the motherboard's slot to remove the notch anyway.

Found better links to that article you referenced, and also one to their ATI Radeon one.

https://web.archive.org/web/20040914084333/ht … s-nv-power.html

https://web.archive.org/web/20110716224645/ht … -powercons.html