VOGONS


The story of the QuForce FX 5800 Ultra...


Reply 100 of 147, by slivercr

Rank: Member
mockingbird wrote on 2020-06-25, 15:34:
slivercr wrote on 2018-04-06, 09:32:

Thanks for the compliment! I am working on a Quadro FX 4000 and turning it into a GF 6800 Ultra right now. Besides modifying the straps for the identity, I also have to unlock an extra quad of pipelines 😉 Other cards that should be interesting are the Quadro4 980XGL and maybe the Quadro FX 1100.

Hey, did you have any luck with this? I have a 6800GS and I want to modify the straps to unlock the extra pixel pipelines and vertex shader pipelines through this method instead of using RivaTuner.

Thanks

Never finished.
Most of my cards were sold before I moved across the world and I haven't found any cheap Quadros around where I am now. If I ever get more, I'll play with this again.

As a side note, RivaTuner will yield the exact same result when in Windows. It may be less convenient if you plan to swap cards around and have to set nvstrap every time, but the result is the same.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 101 of 147, by mockingbird

Rank: Oldbie
slivercr wrote on 2020-06-25, 16:33:
mockingbird wrote on 2020-06-25, 15:34:
slivercr wrote on 2018-04-06, 09:32:

Thanks for the compliment! I am working on a Quadro FX 4000 and turning it into a GF 6800 Ultra right now. Besides modifying the straps for the identity, I also have to unlock an extra quad of pipelines 😉 Other cards that should be interesting are the Quadro4 980XGL and maybe the Quadro FX 1100.

Hey, did you have any luck with this? I have a 6800GS and I want to modify the straps to unlock the extra pixel pipelines and vertex shader pipelines through this method instead of using RivaTuner.

Thanks

Never finished.
Most of my cards were sold before I moved across the world and I haven't found any cheap Quadros around where I am now. If I ever get more, I'll play with this again.

As a side note, RivaTuner will yield the exact same result when in Windows. It may be less convenient if you plan to swap cards around and have to set nvstrap every time, but the result is the same.

By the way, I bought a Quadro 3000 AGP recently -- it's defective... Corruption on the Windows loading screen and then a system freeze after a few seconds in Windows. OTOH, I have three Quadro 1000 cards that work perfectly fine. I think the 5800's clock and memory were pushed beyond spec, despite what the official spec claimed (in other words, they were factory-overclocked cards). If you can find a fresh Quadro 3000, I suggest running it at Quadro 1000 speeds for longevity. The 2000 and 3000 layouts are also preferable because they accommodate a much wider variety of cooling solutions. I may try memory IC swaps on it in the future to see if that's what's wrong with it. The NV3x chips should not fail, but I wouldn't be surprised if the Samsung memory has.

As for the 6800GS... Besides the 6800GS, I've also got three 6800GT cards... Theoretically, the only things that would differ in the BIOS are the device ID strap and the pixel pipe and vertex shader pipe straps. Maybe I ought to dump both cards and examine the differences... Any idea where to look?


Reply 102 of 147, by slivercr

Rank: Member
mockingbird wrote on 2020-06-25, 20:15:

...
As for the 6800GS... Besides the 6800GS, I've also got three 6800GT cards... Theoretically, the only things that would differ in the BIOS are the device ID strap and the pixel pipe and vertex shader pipe straps. Maybe I ought to dump both cards and examine the differences... Any idea where to look?

Bits 12, 13, 20, 21 of strap0 hold the device ID. No idea for the pipes.
You could compare the corresponding strap bits from both dumps and use the documentation to rule out similarities/differences and figure it out.
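If it helps, here's a minimal Python sketch of that comparison; the bit positions are the ones above, and the two strap0 values are made-up placeholders rather than real dumps:

# Compare the device-ID bits (12, 13, 20, 21) of two strap0 dwords.
ID_BITS = (21, 20, 13, 12)  # device-ID bits 3, 2, 1, 0

def id_bits(strap0):
    # The four device-ID strap bits as a string, highest first.
    return "".join(str((strap0 >> b) & 1) for b in ID_BITS)

def diff_bits(a, b):
    # Every bit position where the two dwords differ.
    return [bit for bit in range(32) if (a ^ b) & (1 << bit)]

strap0_gt = 0xFFFFFFFF  # placeholder value
strap0_gs = 0xFFFFEFFF  # placeholder value
print(id_bits(strap0_gt), id_bits(strap0_gs))  # 1111 1110
print(diff_bits(strap0_gt, strap0_gs))         # [12]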

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 103 of 147, by havli

Rank: Oldbie
mockingbird wrote on 2020-06-25, 20:15:

As for the 6800GS..

Depending on the GPU, some 6800 GS may not unlock at all... those using NV41 and NV42, for instance. Only NV40 or NV45 cards have a chance of unlocking.

HW museum.cz - my collection of PC hardware

Reply 104 of 147, by mockingbird

Rank: Oldbie
slivercr wrote on 2020-06-25, 23:53:

Bits 12, 13, 20, 21 of strap0 hold the device ID. No idea for the pipes.
You could compare the corresponding strap bits from both dumps and use the documentation to rule out similarities/differences and figure it out.

RivaTuner takes the guesswork out with a patch script called "NV40BIOSHwUnitsMaskEliminator.rts", which you apply to your BIOS dump. The script also tells you what needs to be modified if you want to do it by hand:

before:

rt1.png

after:

rt2.png
havli wrote on 2020-06-26, 17:08:

Depending on the GPU, some 6800 GS may not unlock at all... those using NV41 and NV42, for instance. Only NV40 or NV45 cards have a chance of unlocking.

All 6800GS AGP are NV40. Unfortunately, mine experiences graphical glitching and is not suitable for the mod. So not all 6800GS AGP will unlock. I think the score was around 10900 in 3DMark vs. the 6800GT.


Reply 105 of 147, by havli

Rank: Oldbie

Most of them perhaps, but not all. 😀 There is at least one NV41 based - http://hw-museum.cz/vga/311/gainward-geforce- … 0-gs-agp--nv41-

HW museum.cz - my collection of PC hardware

Reply 106 of 147, by mockingbird

Rank: Oldbie
havli wrote on 2020-07-01, 15:06:

Most of them perhaps, but not all. 😀 There is at least one NV41 based - http://hw-museum.cz/vga/311/gainward-geforce- … 0-gs-agp--nv41-

Wikipedia's labelling is a bit confusing, but from it one can extrapolate that all the PCIe chips (which includes cards that use a bridge chip) are TSMC 110nm, while the AGP parts are IBM 130nm.

Bumpgate and underfill problems were a hot topic with nVidia back in the day, but I now wonder if the IBM 130nm chips were in fact much more reliable than the TSMC 110nm parts. I trash-picked a 6600GT yesterday: cleaned her up, oiled the fan, and it works great. It's a factory-overclocked model (and it's AGP, with the bridge chip). So it might be interesting to compare this card's longevity with the 6800 cards here. It's not an apples-to-apples comparison though, because the chip is different. A good test would be to get a native PCIe 6800 card and compare that, because it's much more similar.

To be fair, all the previous chipsets were TSMC too, but the problems did in fact all start with the 90nm TSMC chipsets (as in the motherboard chipsets, which had an enormous failure rate - though it's also worth noting that HP overvolted them from the factory).

EDIT: Several hours into a 3DMark 2003 loop, the 6600GT crashed the system. This lends credence to my theory that IBM's 130nm 6th-generation GeForce parts were more reliable than the TSMC-fabbed ICs.


Reply 107 of 147, by 0xCats

Rank: Newbie

Another addition for those wanting to mod GeForce 4 Ti 4200 cards into Ti 4800.

GeForce 4 Ti 4200 to Ti 4800 Mod

Original straps read out with

nvflash --display 112
E9 8F 71 00 43 10 8F 80 FF FF FF FF 00 00 00 00 
FF FF FF FF 00 00 00 00 00 00 00 08 E9 35 14 E9

Mask we are interested in

FF FF FF FF 00 00 00 00 
FF FF FF FF 00 00 00 00
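Those 32 bytes decode as eight little-endian dwords. Assuming the four dwords at offset 8 are the AND-0, OR-0, AND-1, OR-1 masks (which matches the mask bytes we edit below), a quick Python sketch pulls them out:

import struct

# The 32-byte strap block dumped by nvflash --display 112.
dump = bytes.fromhex(
    "E98F710043108F80"   # leading dwords (not touched here)
    "FFFFFFFF00000000"   # strap0 AND / OR masks, little-endian
    "FFFFFFFF00000000"   # strap1 AND / OR masks, little-endian
    "00000008E93514E9"   # trailing dwords (not touched here)
)
and0, or0, and1, or1 = struct.unpack_from("<4I", dump, 8)
print(f"AND-0={and0:08X} OR-0={or0:08X} AND-1={and1:08X} OR-1={or1:08X}")
# AND-0=FFFFFFFF OR-0=00000000 AND-1=FFFFFFFF OR-1=00000000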

Device IDs

GF4 Ti 4200 = 0281
GF4 Ti 4800 = 0280

What we need to modify:
HW Strap

ID-Bit:               4 3210
0280   = 0000 0010 1000 0000
0281   = 0000 0010 1000 0001
Change = xxxx xxxx xxxx xxx0

So we want to flip bit 0 off.

Old Mask

ID Bit:    4        32        10
Mask:   -xxX xxxx xx32 xxxx xx10 xxxx xxxx xxxx
AND-0:  FF FF FF FF
        1111 1111 1111 1111 1111 1111 1111 1111

OR-0:   00 00 00 00
        0000 0000 0000 0000 0000 0000 0000 0000

AND-1:  FF FF FF FF
        1111 1111 1111 1111 1111 1111 1111 1111

OR-1:   00 00 00 00
        0000 0000 0000 0000 0000 0000 0000 0000

Now we reference the bits under ID Bit columns 3, 2, 1, 0 for each mask.

Original Strap Math:

HW-Strap   : 0001
Strap0-AND : 1111
Results in : 0001
Strap0-OR  : 0000
Results in : 0001

So to turn off bit 0 we can disable the last bit of the Strap0-AND mask.

That gives us 1110 which we must fit into our new Mask.

New Mask

ID Bit:    4        32        10
Mask:   -xxX xxxx xx32 xxxx xx10 xxxx xxxx xxxx
AND-0:  FF FF EF FF
        1111 1111 1111 1111 1110 1111 1111 1111

OR-0:   00 00 00 00
        0000 0000 0000 0000 0000 0000 0000 0000

AND-1:  FF FF FF FF
        1111 1111 1111 1111 1111 1111 1111 1111

OR-1:   00 00 00 00
        0000 0000 0000 0000 0000 0000 0000 0000

Masked Strap Math:

HW-Strap   : 0001
Strap0-AND : 1110
Results in : 0000
Strap0-OR  : 0000
Results in : 0000
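As a sanity check, here is the same math in Python (using a toy strap value with only the ID bits populated, not a real dump):

# At power-on the card effectively computes (hw_strap AND and0) OR or0.
ID_BITS = (21, 20, 13, 12)  # strap0 bits carrying device-ID bits 3, 2, 1, 0

def id_nibble(strap):
    return "".join(str((strap >> b) & 1) for b in ID_BITS)

hw_strap = 1 << 12                         # ID bits read 0001 -> Ti 4200 (0281)
and0, or0 = 0xFFFFEFFF, 0x00000000
print(id_nibble(hw_strap))                 # 0001
print(id_nibble((hw_strap & and0) | or0))  # 0000 -> Ti 4800 (0280)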

Strap code

nvflash --straps 0xFFFFEFFF 0x00000000 0xFFFFFFFF 0x00000000

But alas, we cannot use this command on GF4 cards, since nvflash is dumb and cannot handle straps that don't start with 7F (all GF4 cards use straps that start with FF).
The nvflash --straps command will not work, so you will have to edit your BIOS straps manually.
So hop into your fav hex editor (I used RVB Edit / X-BIOS Editor) and write the new mask down in little-endian byte order:

 FF EF FF FF 00 00 00 00 FF FF FF FF 00 00 00 00 
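(To double-check the byte order, little-endian packing of the four dwords reproduces that line; a quick Python check:)

import struct

# Pack each 32-bit mask as little-endian bytes, as they appear in the BIOS.
for dword in (0xFFFFEFFF, 0x00000000, 0xFFFFFFFF, 0x00000000):
    print(struct.pack("<I", dword).hex(" ").upper(), end="  ")
print()
# FF EF FF FF  00 00 00 00  FF FF FF FF  00 00 00 00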


Then save and flash your new hacked BIOS onto the card.

And since I didn't do this on my own card, but rather helped someone else do it, I'll leave it up to them to show the results 😀

Here's a quick proof

20200822_163523.jpg


There are two types of devices, those that know they've been hacked and those that don't yet know they're going to be hacked.

Reply 108 of 147, by SPBHM

Rank: Oldbie
slivercr wrote on 2018-05-02, 12:29:

On the weekend a new cooler arrived and I've been trying it on. I decided to "downgrade" the card to a GeForce FX 5800 (still an upgrade from the Quadro FX 1000) in order to keep it as a single slot card and avoid heat issues with the VRAM and the RDRAM so close to each other.
IMG_20180502_140956.jpg
IMG_20180502_141023.jpg
The cooler does a great job and is very quiet. I suspect my success with such small coolers has to do with the CPU used in my machine: with a more powerful CPU, the card would be able to stretch its legs a bit more and would get way hotter (I see this effect now when benchmarking a Coppermine 1000 vs a Tualatin 1400).

I think I have the exact same cooler somewhere; in my case it originally came on an ECS 8600GT DDR2, and I remember it being competent at cooling that card even overclocked (to 610MHz, I think).

I'm thinking: if it can cool an "FX 5800", surely it can cool my FX5900SE (400MHz)? That card (from EVGA) also has a noisy fan, and the worst part is that it doesn't offer any temperature readings. The fan runs really loud on startup (100%), then slows down during the boot process, and with some driver/OS combinations it only boosts the speed when running 3D. Unfortunately, on 98 with 45.23 it's stuck at 100% all the time (and AIDA reports 100% fan speed and something like 4500 RPM); I don't think it's something that can be software controlled.

I wonder if you are still running this combination (Quadro + cooler)?

Actually a little OT, but this also reminds me: around 2004 I had another 5900SE from EVGA (the exact same model as the current one), and at some point I downloaded a modded 5950U BIOS for it from a forum (Guru3D, I think). I remember it worked well at 475MHz (the BIOS bumped the vcore), and I think I could run the memory at 900 thanks to looser timings (impossible with the stock BIOS). This thread reminds me of that. There is another detail I find interesting, though my memory is not 100% on this: with the stock BIOS it didn't display a temperature sensor, but with the modded 5950U BIOS it actually worked...

edit> RivaTuner can control the fan speed just fine! It might be the same with the Quadro; it's just that in my case, without a temperature sensor, it seems risky to be playing with it.


Reply 109 of 147, by agent_x007

Rank: Oldbie

Here's a rundown of what's up.

Card in question is this one:

Front.jpg
Back.jpg

It's (apparently) an OC'ed 4200 Ti. Default clocks: 275MHz core, 600MHz memory.
Because it looks like a 4800 Ti PCB, I figured it should be possible to mod it.
Using the previously mentioned method, we successfully changed the ID of this card 😀
Here's how it was recognised (before the restart after flashing):

Name before.png

Performance-wise, it doesn't matter whether you have an OC'ed 4200 Ti "8x" at 4600 Ti speeds or an actual 4800 Ti.

3DMark 01.png

vs.

3DMark 01.png

The only difference between those is the DeviceID (as can be seen in GPU-Z).

Other than that, it matters for Futuremark SystemInfo 😁
Here's a comparison of two 3DMark 03 scores (4200 Ti AGP 8x OC vs. 4800 Ti):
https://www.3dmark.com/compare/3dm03/6546811/3dm03/6546816

I would like to thank 0xCats for his help on this - couldn't have done it without him 😀


Reply 110 of 147, by 0xCats

Rank: Newbie

Modding a GeForce FX5800 to an FX5800 Ultra (this really is only meant for those of you who want to test your golden-chip 5800s).
I like to fully document the process for archival/long-term reference.

original straps on FX5800

FF FF FF 7F 00 00 00 00 FF FF FF 7F 00 00 00 80

FX5800 to FX5800 Ultra Strap modding

Device IDs

GF FX 5800  = 0302
GF FX 5800U = 0301

What we need to modify:

HW Strap
ID-Bit:               4 3210
0301   = 0000 0011 0000 0001
0302   = 0000 0011 0000 0010
Change = xxxx xxxx xxxx xx01

So we want to flip bit 0 on and bit 1 off.

Bit Mask

ID Bit:    4        32        10
Mask:   -xxX xxxx xx32 xxxx xx10 xxxx xxxx xxxx
AND-0:  7F FF DF FF
        0111 1111 1111 1111 1101 1111 1111 1111

OR-0:   00 00 10 00
        0000 0000 0000 0000 0001 0000 0000 0000

AND-1:  7F FF FF FF
        0111 1111 1111 1111 1111 1111 1111 1111

OR-1:   00 00 00 00
        0000 0000 0000 0000 0000 0000 0000 0000

Strap Math:

HW-Strap   : 0010
Strap0-AND : 1101
Results in : 0000
Strap0-OR  : 0001
Results in : 0001


Straps for converting FX5800 to FX5800 Ultra

nvflash --straps 0x7FFFDFFF 0x00001000 0x7FFFFFFF 0x00000000
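A quick Python sanity check of those masks (again using a toy strap value with only the ID bits populated):

ID_BITS = (21, 20, 13, 12)  # strap0 bits carrying device-ID bits 3, 2, 1, 0

def id_nibble(strap):
    return "".join(str((strap >> b) & 1) for b in ID_BITS)

hw_strap = 1 << 13                         # ID bits read 0010 -> FX 5800 (0302)
and0, or0 = 0x7FFFDFFF, 0x00001000
print(id_nibble((hw_strap & and0) | or0))  # 0001 -> FX 5800 Ultra (0301)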

There are two types of devices, those that know they've been hacked and those that don't yet know they're going to be hacked.

Reply 111 of 147, by mockingbird

Rank: Oldbie

Here are some test results, contrasting the performance differences between when the card is run as a Quadro, and when it is run as a GeForce.

The tests were conducted with the 56.64 drivers. 45.23 would have been preferable, but the 5700 series cards will not work with 45.23, so the 56.64 drivers were necessary in order to obtain an apples-to-apples comparison. Otherwise, aside from benchmarking, I would avoid the FX 1100/5700 series cards altogether for this reason alone, since many older games require driver 45.23 or earlier (and in some cases a TI4600 is preferable for its very early driver compatibility, which a further selection of games requires).

The benchmark software in this case was Quake 3, updated to 1.32C with all details set to max and the resolution at 1600x1200x32. The nVidia driver preferences were set to default.

Changing from Quadro to GeForce and vice versa was very conveniently done with RivaTuner 2.24.

GeForce FX 5800 - 210.6 (400/800)
Quadro FX 1000 - 193.9 (400/800)

GeForce FX 5700 - 163.5 (425/650)
Quadro FX 1100 - 135.7 (425/650)

GeForce FX 5700LE - 65.5 (225/366)

Geforce4 TI4600 - 176.6 (300/650)
Quadro4 900XGL - 175.9 (300/650)

Geforce4 MX440 - 81.1 (270/200)
Quadro4 550XGL - 81.1 (270/200)

A few takeaways:

1) The Quadro performance hit only seems to affect the GeForce FX cards, not the GeForce 4.
2) The TI4600 is not nearly as fast as the GeForce 5800, contrary to popular belief. Rather, it sits closer to the GeForce 5700 performance-wise.
3) The GeForce FX 5700LE is even slower than the previous-generation GeForce4 MX440, so it should be avoided.
4) The GeForce4 MX440 is still a widely available, great budget choice for an old build. Performance can be further increased by dropping to 16-bit color depth (it is worth noting that this performance increase was not possible on Radeon cards because of their architecture).


Reply 112 of 147, by slivercr

Rank: Member
mockingbird wrote on 2020-12-02, 16:47:

...
A few takeaways:

1) The Quadro performance hit only seems to affect the GeForce FX cards, not the GeForce 4.
2) The TI4600 is not nearly as fast as the GeForce 5800, contrary to popular belief. Rather, it sits closer to the GeForce 5700 performance-wise.
3) The GeForce FX 5700LE is even slower than the previous-generation GeForce4 MX440, so it should be avoided.
4) The GeForce4 MX440 is still a widely available, great budget choice for an old build. Performance can be further increased by dropping to 16-bit color depth (it is worth noting that this performance increase was not possible on Radeon cards because of their architecture).

I didn't expect #2 to be a thing, let alone popular belief.
Fully agree with you on the mx440s, they are great for P3 systems.

Also, maybe edit your post with the specs of the testbench, just so the post is useful as a reference point for other people.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 113 of 147, by mockingbird

Rank: Oldbie
slivercr wrote on 2020-12-02, 19:03:

I didn't expect #2 to be a thing, let alone popular belief.

Hi, good to see you 😀

Here is an example of this claim on this forum:

Iris030380 wrote on 2019-06-08, 11:03:

A standard 5900 is very similar in performance to a Ti4600 in DirectX 8 and OpenGL titles if I remember. The only time the 5900 will grow more legs would be with AA applied. The FX5900 Ultras are a bit faster but still not a huge gap. DirectX 9 would be an entirely different story though.

The 5800/5900 has better raw performance than the TI4600, AA or no AA...

Fully agree with you on the mx440s, they are great for P3 systems.

...with the stipulation that it's a 128-bit MX440 and not a 64-bit MX440 or an MX440SE.

Also, maybe edit your post with the specs of the testbench, just so the post is useful as a reference point for other people.

The original idea was to run them on a BX/Tualatin 1.4 system, but the CPU limited the GPU and they all maxed out at 150FPS or so... A high-clocked Athlon XP or P4 system would probably also do the trick, but in this case the tests were run on an 865/E6420 (2.13GHz) platform.

That's not to say that a 5800 or better is wasted on a Tualatin build... To the contrary: if AA and other fancy features are enabled, the card will very much be pushed to its limit, regardless of the CPU.


Reply 114 of 147, by slivercr

Rank: Member
mockingbird wrote on 2020-12-02, 21:46:

Hi, good to see you 😀

Thanks!

mockingbird wrote on 2020-12-02, 21:46:

...with the stipulation that it's a 128-bit MX440 and not a 64-bit MX440 or a MX440SE.

Oh, for sure: full bus width or nothing! I wrote "mx440s" as a plural; I wasn't even thinking about the SE.

mockingbird wrote on 2020-12-02, 21:46:

Here is an example of this claim on this forum:

Iris030380 wrote on 2019-06-08, 11:03:

A standard 5900 is very similar in performance to a Ti4600 in DirectX 8 and OpenGL titles if I remember. The only time the 5900 will grow more legs would be with AA applied. The FX5900 Ultras are a bit faster but still not a huge gap. DirectX 9 would be an entirely different story though.

The 5800/5900 has better raw performance than the TI4600, AA or no AA...

😮 I suspect they were CPU bound 🤷🏾

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 115 of 147, by slivercr

Rank: Member

Just popped onto eBay to see what the prices are for the Quadro vs the GeForce 5800U these days.
It's amusing that literally one bit (the strap difference needed to change identity) makes for an order of magnitude in price.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 116 of 147, by Mamba

Rank: Oldbie

I was lucky enough to get an FX3000 in good condition.
I am testing it extensively and I get around 6000 points in 3DMark01 with a single P-III 700MHz (100MHz FSB) on a P2B-D.
Is it worth changing it to a GeForce?

I mean... : https://www.youtube.com/watch?v=V4tOBgoMuwM

I am a bit confused by this video and your findings, and I can't say I am not attracted by the mod itself.
Truth is, I do not want to screw everything up.

Is it simpler to just overclock it?
The FX3000 cooler is very good anyway.

Reply 117 of 147, by slivercr

Rank: Member
Mamba wrote on 2021-01-12, 16:27:

I was lucky enough to get an FX3000 in good condition.
I am testing it extensively and I get around 6000 points in 3DMark01 with a single P-III 700MHz (100MHz FSB) on a P2B-D.
Is it worth changing it to a GeForce?

I mean... : https://www.youtube.com/watch?v=V4tOBgoMuwM

I am a bit confused by this video and your findings, and I can't say I am not attracted by the mod itself.
Truth is, I do not want to screw everything up.

Is it simpler to just overclock it?
The FX3000 cooler is very good anyway.

Earlier in this thread you can find my opinion on that video.

It's really up to you to decide if it's "worth it". I suggest using RivaTuner to change the ID, then testing the games you want to play and seeing for yourself.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 118 of 147, by Mamba

Rank: Oldbie
slivercr wrote on 2021-01-12, 16:59:
Mamba wrote on 2021-01-12, 16:27:

I was lucky enough to get an FX3000 in good condition.
I am testing it extensively and I get around 6000 points in 3DMark01 with a single P-III 700MHz (100MHz FSB) on a P2B-D.
Is it worth changing it to a GeForce?

I mean... : https://www.youtube.com/watch?v=V4tOBgoMuwM

I am a bit confused by this video and your findings, and I can't say I am not attracted by the mod itself.
Truth is, I do not want to screw everything up.

Is it simpler to just overclock it?
The FX3000 cooler is very good anyway.

Earlier in this thread you can find my opinion on that video.

It's really up to you to decide if it's "worth it". I suggest using RivaTuner to change the ID, then testing the games you want to play and seeing for yourself.

Damn...
I knew I would receive this answer...

I will try RivaTuner, hoping I won't destroy it.

Reply 119 of 147, by mockingbird

Rank: Oldbie

I strongly recommend you underclock the memory of the FX3000 to at least 200MHz below spec. The DDR on the FX3000 is notorious for not lasting long at stock clocks. This does not apply to the Quadro 1000 and 2000, which used vastly improved DDR2.

I would advise doing this via a BIOS mod rather than through software, because the card will run at stock clocks until the software applies the changes, well after the boot process completes.
