VOGONS


Ati Radeon X1950 PRO AGP problems!


First post, by nd22

User metadata
Rank Oldbie

Hello everybody! I acquired a Sapphire Radeon X1950 PRO AGP 512mb recently and installed the card in several systems in order to test it, including:
1. socket 462 - Barton 3200, 2*1gb, Abit AN7
2. socket 462 - Barton 3200, 2*1gb, Abit KW7
3. socket 754 - Clawhammer 3700, 2*1gb, Abit KV8 PRO
4. socket 754 - Clawhammer 3700, 2*1gb, Abit NF8 PRO
In all systems the 10.2 driver installs just fine, but after the reboot I am greeted with a 640*480 8-bit display and any attempt to run a 3D application such as 3dmark 2001 leads to a freeze! The seller is positive that the card is 100% OK and insists that the problem is on my end. I searched around and found there is a hotfix available for later Radeon AGP cards, but I could not find it! Should I use an older driver version? Does anybody have that particular hotfix?

Reply 1 of 26, by The Serpent Rider

User metadata
Rank l33t++

The card has a typical BGA failure. The AGP hotfix was made for Radeon HD cards. You're free to try this collection though: http://radeon.ru/drivers/amd/xp

The seller is positive that the card is 100%

Unless he documented that the card works fine in 3D, he might as well claim he's the king of Mars.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 2 of 26, by winuser_pl

User metadata
Rank Member

Also worth mentioning: these CPUs are underpowered for this GPU. You'd most likely not get 100% GPU utilization in games. The Clawhammer is the closest to optimal.

PC1: Highscreen => FIC PA-2005, 64 MB EDO RAM, Pentium MMX 200, S3 Virge + Voodoo 2 8 MB
PC2: AOpen => GA-586SG, 512 MB SDRAM, AMD K6-2 400 MHz, Geforce 2 MX 400

Reply 3 of 26, by Hoping

User metadata
Rank Oldbie

I've seen a lot of ATI cards fail because of the bridge chip; it gets very hot and doesn't have any cooling. NVIDIA cards always had a heatsink on the bridge chip, but for some reason I've never seen an ATI card with a heatsink on it.
Two weeks ago my X1650 Pro AGP died, and it was the bridge chip. I also have an HD2400 Pro AGP that had the same problem, and an HD3450 AGP.
Usually reflowing fixes the problem, at least the first time. But I've read somewhere that the bridge chips had different revisions, so maybe not all of them die; but if the chip gets very hot, it will die - it's just a matter of time.

Reply 4 of 26, by nd22

User metadata
Rank Oldbie

I installed driver version 9.11 from the link provided by The Serpent Rider and now it passes the first test in 3dmark 2003 but locks up in the second!

Reply 5 of 26, by nd22

User metadata
Rank Oldbie

I installed the Omega drivers 7.12 found on Phil's website and tried 3dmark 2001 - freeze in the first test! Unless anyone here has some advice, I am going to send the card back!

Reply 6 of 26, by swaaye

User metadata
Rank l33t++

The ATI bridged AGP cards were super troublesome.

Reply 7 of 26, by nd22

User metadata
Rank Oldbie

Epilogue of the story:
Yesterday I went back to the seller with the card and let him test it in one of his systems: the exact problem that I encountered was reproduced - a freeze in any 3dmark, Half-Life 2, Doom 3 and so on. He took the card back and gave me a GeForce 7900 GS 512mb AGP from Gainward! Upon returning home I installed it in the Clawhammer system with the Abit KV8 Pro board. The 175.19 driver installed fine, and after returning a score of around 18000 in 3dmark 2001 I spent the evening playing Doom 3 RoE at 1600*1200 max settings with 2X AA, enjoying once more one of my favorite games!
My conclusion is that NVIDIA cards with bridge chips are more stable and more viable in the long term.

Reply 8 of 26, by aaronkatrini

User metadata
Rank Member

I had a similar experience with a Radeon 4650 AGP. The driver would install fine, but after a reboot there was no increased resolution, and the card wasn't properly detected in Device Manager. I was selling the card as broken/partially working, but then someone told me about this ATI AGP HotFix driver; I installed it and the card worked fine after that.

There are a few versions of this driver, one of them can be found here:
https://drivers.softpedia.com/get/GRAPHICS-BO … ix-for-XP.shtml

If I'm not mistaken though, these HotFix drivers are only for the 3xxx and 4xxx series of cards, so I'm not sure it would work on the X1950 Pro...

Too bad you didn't have a chance to try the 12.3 or 12.4 hotfix 🙁

Reply 9 of 26, by nd22

User metadata
Rank Oldbie

I even installed the 7.1 drivers that were used by review sites when this card was released, and no luck - immediate freeze in every single test!

Reply 10 of 26, by swaaye

User metadata
Rank l33t++

I wouldn't be surprised if ATI's drivers are the cause of all the problems with the AGP bridge cards. They were not well supported and there occasionally were regressions that broke stuff. The QA testing didn't seem to be there for the AGP cards after 2005 or so.

Reply 11 of 26, by nd22

User metadata
Rank Oldbie

You are probably right. From my experience, every single NVIDIA card works right out of the box with every single driver version, even if some performance is lost when the driver is much newer, whereas ATI cards, especially AGP ones, are very picky about drivers. Today, using modern systems with modern AMD cards such as the RX580, I have not noticed a problem with Radeon drivers, but with old ones - Radeon 8500 through HD4890 - drivers can be a problem, so the rumour that ATI had bad drivers may actually be true!
Now for the experience with the NVIDIA card: it is actually the Gainward 7800GS GLH model with a GeForce 7900GS core that has 20 pixel shaders and 7 vertex shaders. There is a faster card with the 7900GT core named 7800GS+ with 24 PS and 8 VS, but I do not have that one. Performance is incredible: Doom 3 runs at 1600*1200 with 2X AA at a locked 60 FPS, a feat not possible with the regular 7800GS, with which I get dips to 30 FPS in certain areas. Half-Life 2, the original release from 2004, also runs at 60 FPS with 4X AA. Having 512mb of memory onboard helps a lot, as MSI Afterburner reports that all of it is used in Doom 3. Temperatures are really good: it never exceeded 60 degrees. The processor - the Clawhammer 3700 - bottlenecks the card at 1280*1024 but not at 1600*1200, where both the CPU and the GPU are stressed to the max. I will test the card in the future in a socket 939 system based around an Athlon 64 4800 and an Abit AV8, but until then it's back to Doom 3 😀!

Reply 12 of 26, by swaaye

User metadata
Rank l33t++

It sounds like the regular 7800 GS is essentially a beefed-up 6600 GT. Still not a weak card for Doom 3, but 1600x1200 is demanding a lot of fillrate.
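
For a rough sense of what 1600x1200 asks of a card, here is a back-of-envelope sketch (plain arithmetic, not a measurement; the pass count is an illustrative assumption, since Doom 3 shades the scene multiple times per light):

```python
# Rough pixel throughput needed at 1600x1200 - illustrative only.
width, height, target_fps = 1600, 1200, 60

pixels_per_frame = width * height          # 1,920,000 pixels
base_rate = pixels_per_frame * target_fps  # ~115 Mpixels/s for a single pass

# Doom 3 renders a depth pre-pass plus one shading pass per light, so the
# effective fill requirement is several times the base rate. The factor
# below is an assumed ballpark, not a measured value.
assumed_passes = 5
print(f"single pass: {base_rate / 1e6:.0f} Mpixels/s")
print(f"~{assumed_passes} passes:   {base_rate * assumed_passes / 1e6:.0f} Mpixels/s")
```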

Reply 13 of 26, by The Serpent Rider

User metadata
Rank l33t++

7800GS is essentially just a renewed GeForce 6800GT/Ultra. Practically identical in performance, due to hardware configuration and clocks.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 14 of 26, by agent_x007

User metadata
Rank Oldbie

The 7800 GS AGP is just a plain weird card:

The attachment 3DMark 01 SE (fillrate).png is no longer available

^Athlon64 x2 6400+/nForce3 + Windows XP SP3, NV driver : 307.83/ATI driver : Cat 10.2

Last edited by agent_x007 on 2021-02-19, 21:47. Edited 1 time in total.


Reply 15 of 26, by swaaye

User metadata
Rank l33t++
The Serpent Rider wrote on 2021-02-19, 20:27:

7800GS is essentially just a renewed GeForce 6800GT/Ultra. Practically identical in performance, due to hardware configuration and clocks.

It has only 8 ROPs. But you are right that it is closer to those than I originally thought. I was thinking the 6600GT is 16/8 when it is actually 8/4, but then the 6600GT is at 500 MHz instead of 375 MHz. It's been a while since I pondered these cards...

https://www.anandtech.com/show/1937/3
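
For reference, a quick sketch of the theoretical fillrates implied by the pipeline/ROP/clock figures discussed in this thread (the 7800 GS ROP count is argued about a few posts down, so treat these as assumed rather than verified specs):

```python
# Theoretical fillrate = units x core clock. Figures below are the ones
# discussed in this thread, assumed rather than verified.
cards = {
    #            (pixel pipes, ROPs, core MHz)
    "6600 GT":   (8,  4,  500),
    "7800 GS":   (16, 8,  375),
    "6800 GT":   (16, 16, 350),
}

for name, (pipes, rops, mhz) in cards.items():
    texel = pipes * mhz / 1000  # Gtexels/s (one texture unit per pipe assumed)
    pixel = rops * mhz / 1000   # Gpixels/s
    print(f"{name}: {texel:.1f} Gtexels/s, {pixel:.1f} Gpixels/s")
```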

Reply 16 of 26, by swaaye

User metadata
Rank l33t++
agent_x007 wrote on 2021-02-19, 21:07:

7800 GS AGP is just plain weird card :
3DMark 01 SE (fillrate).png
^Phenom II/nForce3 + Windows XP SP3, NV driver : 307.83/ATI driver : Cat 10.2

500 MHz core? Is that the stock speed for your card?

Reply 17 of 26, by agent_x007

User metadata
Rank Oldbie
swaaye wrote on 2021-02-19, 21:28:

500 MHz core? Is that the stock speed for your card?

No, "stock" on my card is 440MHz (I own XFX model).
I OC'ed it a bit for my test. Here's a bit more complex Fillrate Benchmark 2004 comparison :

The attachment Fillrate benchmark.png is no longer available

And what GPU-Z/AIDA64 think about it:

The attachment AIDA64.PNG is no longer available


Reply 18 of 26, by nd22

User metadata
Rank Oldbie

The 6600GT does not stand a chance in Doom 3 at 1600*1200 ultra settings, dipping to the low 20s frequently! I know because I tested quite a few NVIDIA cards in Doom 3:
1. GeForce 5700 Ultra - abysmal performance even at 800*600. I suspect the whole FX series suffers greatly in DirectX 9.0 titles.
2. GeForce 6600 GT - I got the Leadtek anniversary edition that has the "correct" GPU/memory clocks - 500/1000 - however only 128mb of RAM, which is inadequate in this game. It runs OK at ultra settings 1024*768 but not a locked 60 fps!
3. GeForce 6800 Ultra - first card that can run Doom 3 at 1600*1200 without dropping below 30 fps. AA is out of the question, and the experience can get pretty choppy in RoE when you have to fight multiple enemies at once at 30 fps!
4. GeForce 7800 GS - from Leadtek, first card that can run Doom 3 at 1600*1200 ultra settings at mostly 60 fps. The experience is slightly better than with the 6800 Ultra but not quite there!
5. GeForce 7900 GS - from Gainward, this one has 512mb of RAM and it really helps! First card that can use 2X AA without severe performance drops; without AA, 60 fps locked at 1600*1200 ultra settings. Like all GeForce 6 and 7 series cards, AA is the Achilles' heel - 16X AA brings performance down to the 20s!

Reply 19 of 26, by The Serpent Rider

User metadata
Rank l33t++

It has only 8 ROPs.

Nope.

but then 6600GT is at 500 MHz instead of 375 MHz.

And? The 6800GT is 350 MHz by default.

I must be some kind of standard: the anonymous gangbanger of the 21st century.