VOGONS


Video card for Athlon XP?


Reply 20 of 50, by swaaye

User metadata
Rank l33t++
obobskivich wrote:

IME the R300 cards had awful longevity (all of mine died years ago 😢), despite being decent performers when brand new. Some folks around here have speculated it's because their coolers were usually the bare minimum for what the chips actually needed (and indeed they're usually pretty flimsy looking). Personally I wouldn't go after another one that was designed to need fan cooling, because of that experience. I don't know if the cards you have may have been "victims" of a similar phenomenon or not.

I've had a few 9700s need underclocking to remain stable. Specifically, there would be image artifacts and the RAM clock needed to be reduced. I don't know what the cause of this would be. Could be the RAM had run too hot over the years, or the GPU memory controller, or perhaps even the PCB....

The cards that use paste instead of thermal wax also frequently have dried-out paste. Definitely need to look for that. The 9800 and 9600 typically use paste, I think, whereas the 9700 and 9500 have wax TIM.

Reply 21 of 50, by meljor

User metadata
Rank Oldbie

I also noticed some time ago that for these R300 cards the only survivors were the ones that had an aftermarket cooler attached.

Same story for the x800 and x1900 series. Nvidia's 6800 series can also go defective when not cooled well; the 7800 series is even worse, and I don't see that many for sale lately.

I use a 6800 ultra AGP but underclock it slightly just to make sure it will live longer, even though it has a better cooler than the 6800GT.

asus tx97-e, 233mmx, voodoo1, s3 virge ,sb16
asus p5a, k6-3+ @ 550mhz, voodoo2 12mb sli, gf2 gts, awe32
asus p3b-f, p3-700, voodoo3 3500TV agp, awe64
asus tusl2-c, p3-S 1,4ghz, voodoo5 5500, live!
asus a7n8x DL, barton cpu, 6800ultra, Voodoo3 pci, audigy1

Reply 22 of 50, by sunaiac

User metadata
Rank Oldbie
Mau1wurf1977 wrote:

Yea the 6800GT was a beast when it came out. It put Nvidia back into pole position after they struggled a little with the FX series.

No.

[attached benchmark chart: IMG0007763.gif]

You have to wait for the 8800 to see nVidia back on top, unless you choose your game subset really carefully, i.e. only engines programmed with nVidia in mind, like Doom 3.

R9 3900X/X470 Taichi/32GB 3600CL15/5700XT AE/Marantz PM7005
i7 980X/R9 290X/X-Fi titanium | FX-57/X1950XTX/Audigy 2ZS
Athlon 1000T Slot A/GeForce 3/AWE64G | K5 PR 200/ET6000/AWE32
Ppro 200 1M/Voodoo 3 2000/AWE 32 | iDX4 100/S3 864 VLB/SB16

Reply 23 of 50, by DosFreak

User metadata
Rank l33t++

http://www.newegg.com/Product/Product.aspx?It … N82E16814161337

Works on my XP 2800+

How To Ask Questions The Smart Way
Make your games work offline

Reply 24 of 50, by AlphaWing

User metadata
Rank Oldbie

I have a 9700pro and a 9800pro with their reference sinks that both work fine still.
The 9800pro ran for years and years in the same machine, which saw daily use, but it did have an 80mm fan blowing directly down on it and the first few PCI slots, and the TIM was replaced with Arctic Silver II back when I got it new. It's not doing anything now, though, other than sitting in a drawer, as that machine was replaced last year with a Core Duo and an X800.

That said, the 9800pro's reference sink is still a piece of junk; if the fan were to fail on it and there were no secondary cooling, you'd have a card that would soon fail. The 9700's, though, is not that bad in comparison, as it's much bigger. It's sitting in a 3.7GHz P4 at the moment.

Reply 25 of 50, by shamino

User metadata
Rank l33t

I'll add my 9800 Pro to the list - last time I tried to use it, it was no longer working correctly. I tried reflowing the solder joints on some surface-mount caps, because I thought one or two had cracked. At first I thought that fixed it, but not long after it began glitching again.
It was part of what I felt was a perfectly tuned late 32-bit Athlon nForce2 system. Sad that the card died, it really breaks up the build, but apparently this has been common. If I find another I'll make a point to beef up the cooling as people here have suggested. It's annoying that gaming cards aren't built to last, but they just aren't.

A key difference between the 9800/etc. cards and any of the later "X" cards is that the 9xxx family still draws most of its power from the 5V rail. I think the very next generation of cards leans on the 12V rail instead. This makes a difference in what PSU specs you need to run them.
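To make the rail point concrete, here is a minimal sketch of a per-rail power-budget check. The current figures are made-up placeholders, not measurements; substitute real numbers, e.g. from a review that measured slot and aux-connector draw separately.

```python
# Per-rail power-budget check: older cards like the 9xxx series load the
# 5V rail, newer ones the 12V rail, so the same total wattage can stress
# a PSU very differently depending on its per-rail ratings.
def rail_headroom(rail_volts, card_amps, psu_rail_amps):
    """Return (watts drawn by the card, watts of headroom left on the rail)."""
    draw_w = rail_volts * card_amps
    capacity_w = rail_volts * psu_rail_amps
    return draw_w, capacity_w - draw_w

# Hypothetical example: a card pulling 6 A from 5V on a PSU rated 30 A @ 5V.
draw, headroom = rail_headroom(5.0, 6.0, 30.0)
print(f"5V rail: {draw:.0f} W drawn, {headroom:.0f} W headroom")  # 30 W drawn, 120 W headroom
```

The same function works for a 12V-heavy card by passing the 12V figures instead.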

I've usually preferred nVidia, but that 9800 Pro worked so perfectly that I was planning to switch teams and buy an ATI HD5750/70 for my next PC. I only changed my mind because of technical problems with an interim 4350 - it was faster than the 9800 Pro but it had problems the old card never did. If the slots were compatible I would have switched back to the 98Pro.

Reply 26 of 50, by Unknown_K

User metadata
Rank Oldbie
meljor wrote:

I also noticed some time ago that for these R300 cards the only survivors were the ones that had an aftermarket cooler attached.

Same story for the x800 and x1900 series. Nvidia's 6800 series can also go defective when not cooled well; the 7800 series is even worse, and I don't see that many for sale lately.

I use a 6800 ultra AGP but underclock it slightly just to make sure it will live longer, even though it has a better cooler than the 6800GT.

I just got in a 6800 GT AGP 256MB card and had to clean dust out of the heatsink; it had clogged the flow path (common in those cards), so cooling would have suffered. Works fine in VGA; I haven't tested it in 3D yet. The heatsink on the 6800 GT is much bigger than the one on the original 6800 128MB model I also have.

Collector of old computers, hardware, and software

Reply 27 of 50, by Kahenraz

User metadata
Rank l33t
swaaye wrote:

6800 supports older games better. It has better 16-bit color support and D3D5 fog table, for example.

I remember, now that you mention it. The FX series was also the last to support 8-bit palettized textures. I bought a passively cooled 5200 for my retro system for this reason.

Reply 28 of 50, by swaaye

User metadata
Rank l33t++

Check those 9800 cards for dried-up thermal paste. I found that when I got a 256MB 9800 Pro a while back.

Oh, and the 9500/9700 cards use a solid TIM that turns liquid when the GPU heats up, so don't mistake that stuff for dried-up paste. That wax-like TIM is needed as a gap filler on the 9500/9700 because the GPU shim creates a small gap between the die and the heatsink (normal paste won't work). The 9800 usually has a raised area on the heatsink to eliminate the gap.

Reply 29 of 50, by Mau1wurf1977

User metadata
Rank l33t++

The thing is, my Radeon cards work, but I couldn't get a driver that worked under W98SE. The 9800 gave me a blue screen, the 9550 a crash to desktop. This is with the latest W98 drivers from the AMD website.

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 32 of 50, by Firtasik

User metadata
Rank Oldbie

My Radeon 9550 works fine with DVI. Can't say this about GeForce FX 5500 due to lack of 1920x1200 support and artifacts when using HDMI to DVI adapter.

11 1 111 11 1 1 1 1 1 11 1 1 111 1 111 1 1 1 1 111

Reply 33 of 50, by obobskivich

User metadata
Rank l33t

Regarding DVI compliance, from what I've read the issue can vary from model to model (that is, an Asus Radeon XYZ vs. a Leadtek Radeon XYZ), especially with chipset designs that rely on an external TMDS transmitter (which is true of many older cards). The Radeon 9 and GeForce FX were, to my understanding, the last generation where this was a serious concern, and the GeForce 6 and Radeon X should have considerably fewer issues. Alternately, professionally-oriented cards like the Quadro FX tended to fare better in this respect, as they often had quality TMDS transmitters as part of their board designs. Some of the GeForce FX-era Quadro cards support dual-link DVI, if that's a need (I think they're among the few universal AGP cards with Windows 9x drivers to do so as well).

Reply 35 of 50, by swaaye

User metadata
Rank l33t++
Firtasik wrote:

My Radeon 9550 works fine with DVI. Can't say this about GeForce FX 5500 due to lack of 1920x1200 support and artifacts when using HDMI to DVI adapter.

Yeah, I've seen that with the 5200/5500. The one I had didn't even support 1680x1050, but I managed to create a CVT-RB custom resolution that worked.
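For anyone wanting to build such a custom resolution by hand, here is a simplified sketch of the VESA CVT Reduced Blanking (v1) timing math as I understand it. It assumes a non-interlaced mode and takes the vertical sync width as a caller-supplied parameter (CVT derives it from the aspect ratio; 6 lines for 16:10).

```python
# Simplified VESA CVT Reduced Blanking (v1) timing calculation.
# CVT-RB fixes horizontal blanking at 160 pixels total
# (48 front porch + 32 sync + 80 back porch) and requires a
# vertical blanking interval of at least 460 microseconds.
import math

def cvt_rb(h_pixels, v_lines, refresh=60.0, v_sync=6):
    H_BLANK = 160               # fixed horizontal blanking, pixels
    RB_MIN_V_BLANK_US = 460.0   # minimum vertical blanking interval
    CLOCK_STEP_MHZ = 0.25       # pixel clock granularity

    # Estimate the line period, then how many blank lines fit the 460 us.
    h_period_us = ((1_000_000.0 / refresh) - RB_MIN_V_BLANK_US) / v_lines
    vbi_lines = int(RB_MIN_V_BLANK_US / h_period_us) + 1
    vbi_lines = max(vbi_lines, 3 + v_sync + 6)  # VFP + sync + min back porch

    v_total = v_lines + vbi_lines
    h_total = h_pixels + H_BLANK
    pclk_mhz = CLOCK_STEP_MHZ * math.floor(
        (refresh * v_total * h_total / 1_000_000.0) / CLOCK_STEP_MHZ)
    return h_total, v_total, pclk_mhz

print(cvt_rb(1680, 1050))  # -> (1840, 1080, 119.0)
```

For 1680x1050 at 60 Hz this reproduces the totals and the 119.00 MHz pixel clock of the well-known "1680x1050R" reduced-blanking modeline.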

The 9-series Radeons do have DVI problems as well, though. I remember reading about it. There are some configuration tweaks in the control panel that people needed to try. It surely varies from card to card and from monitor to monitor. I ran a 1920x1200 DVI LCD on my 9700 all right, though.

The 8500 and 7500 are really bad with DVI. It doesn't work until you hit Windows. It seems they can't output low resolutions correctly, and the monitor just sleeps. I've seen flickers, so the cards do seem to output something. Windows sometimes has artifacting too. These chips have integrated TMDS transmitters, but apparently they are fubar.

Reply 36 of 50, by Standard Def Steve

User metadata
Rank Oldbie

I've had DVI problems @ 1920x1200 not only with a 9800 Pro but also with an X800XT A-I-W, both built by ATI. My Powercolor Radeon 7000 PCI has no problem outputting 19x12 over DVI. It's slow as hell doing it, but that's another story. 🤣

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 38 of 50, by shamino

User metadata
Rank l33t
boxpressed wrote:

For the 6800GT, which rail on the PS should I be concerned with? And what's a sufficient current for that rail?

This article did an impressive job of investigating this, and fortunately they covered the 6800GT:
http://www.xbitlabs.com/articles/graphics/dis … nv-power_3.html

The chart on that page shows measured current draws for each rail, with the slot and aux power connector measured separately.