VOGONS


The story of the QuForce FX 5800 Ultra...


Reply 120 of 147, by Mamba

Rank: Oldbie
mockingbird wrote on 2021-01-12, 19:11:

I strongly recommend you underclock the memory of the FX3000 to at least 200 MHz below spec. The DDR in the FX3000 is notorious for not lasting long at stock clocks. This does not apply to the Quadro 1000 and 2000, which used vastly improved DDR2.

I would advise doing this via a BIOS mod rather than through software, because the card will run at stock clocks until the software applies the changes, well after the boot process completes.

Do you have a modded BIOS for me?

Reply 122 of 147, by Mamba

Rank: Oldbie
mockingbird wrote on 2021-01-12, 19:54:

If you dump yours I can do it for you, but it should be pretty simple to do it yourself.

Do not underestimate my stupidity.
Will try to send you the file tomorrow, thank you so much!

Reply 124 of 147, by mockingbird

Rank: Oldbie
mockingbird wrote on 2021-01-12, 19:11:

I strongly recommend you underclock the memory of the FX3000 to at least 200 MHz below spec. The DDR in the FX3000 is notorious for not lasting long at stock clocks. This does not apply to the Quadro 1000 and 2000, which used vastly improved DDR2.

I would advise doing this via a BIOS mod rather than through software, because the card will run at stock clocks until the software applies the changes, well after the boot process completes.

I should clarify that the Quadro 1000 and 2000 may not be that much of an improvement with regard to memory quality:

https://en.wikipedia.org/wiki/DDR2_SDRAM#Rela … _to_GDDR_memory

"GDDR2, a form of GDDR SDRAM, was developed by Samsung and introduced in July 2002. The first commercial product to claim using the "DDR2" technology was the Nvidia GeForce FX 5800 graphics card. However, it is important to note that this GDDR2 memory used on graphics cards is not DDR2 per se, but rather an early midpoint between DDR and DDR2 technologies. Using "DDR2" to refer to GDDR2 is a colloquial misnomer. In particular, the performance-enhancing doubling of the I/O clock rate is missing. It had severe overheating issues due to the nominal DDR voltages. ATI has since designed the GDDR technology further into GDDR3, which is based on DDR2 SDRAM, though with several additions suited for graphics cards.

GDDR3 and GDDR5 is now commonly used in modern graphics cards and some tablet PCs. However, further confusion has been added to the mix with the appearance of budget and mid-range graphics cards which claim to use "GDDR2". These cards actually use standard DDR2 chips designed for use as main system memory although operating with higher latencies to achieve higher clockrates. These chips cannot achieve the clock rates of GDDR3 but are inexpensive and fast enough to be used as memory on mid-range cards.

So are the FX2000 and FX1000 that much better than the FX3000 with regard to memory quality? It comes down to whether Samsung "GDDR2" is better than Hynix GDDR. Neither are based on DDR2.


Reply 125 of 147, by gmipf

Rank: Newbie

I have finally got a Quadro FX1000 and want to overclock it to 500/500 MHz, but I need a good cooling solution before attempting the OC and the BIOS mod to FX5800U. Can someone recommend a good GPU cooler which fits this card? I have considered the Prolimatech MK-26, but I'm not sure if the screw holes will line up. And any recommendations for memory cooling? The FX1000 essentially doesn't have any memory cooling.

Reply 126 of 147, by gmipf

Rank: Newbie
slivercr wrote on 2018-03-21, 16:10:
chose007 wrote:

OK, thanks for the link.
That was a cooling mod for daily use I did 10 years back, with 120mm fans - because of the noise from the original cooler, of course.
Today I have several FX5800 cards, but I'm not planning any voltage mods, only tests or repairs. These cards are so hard to find today that voltage mods should be done on FX5600/5700/5900 cards instead.

Measuring requires a running card, so give me time. At the moment I don't have room for another setup on my table 😁

It's true that they are rare. Around the web you find a figure of 100,000 total NV30 chips, half of which were for Quadros, the rest divided between OEMs. I don't know if it's true, but that is a SMALL number.

Rare and all, I did a voltage mod on this one: stock DDR voltage for this card was 2.3V. As far as I can tell from looking online and at a datasheet for "similar" Samsung chips, they should run at 2.5V. I brought them up to 2.48V, which gave me some stability when clocking them beyond 400 MHz. Easy soldering to some test points that were nearby, and easily reversible too.
vddr_mod.jpg

I would like to do this mod on my Quadro FX1000. Which value does the resistor have? I'm not good at reading those stripes. And do I have to remove something else beforehand? Do you just bridge from a 3.3V line and drop it with the resistor to 2.48V? I also want to delid the NV30 and use Thermal Grizzly Hydronaut as the thermal paste. I already found and ordered a (somewhat rare?) Zalman VF1000-LED, fully boxed. I also need to figure out which thermally conductive double-sided tape to get for the RAM cooling; the pads included by Zalman aren't that great, I think.

EDIT: On the FX1000, 2 electrolytic caps are also missing which are present on the FX5800(U) and FX2000. Are they the same as the other pink cans? I want to install those missing caps as well. I think they are for stabilizing the GPU voltage?
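On the idea of bridging from a 3.3V line and dropping it with a resistor: a plain series resistor can't hold a rail at a fixed voltage, because the drop across it scales with whatever current the memory happens to draw (V = I·R). A toy Ohm's-law sketch with purely hypothetical numbers:

```python
# Why a series resistor alone can't set a memory rail: the voltage drop
# depends on the load current, which varies with memory activity.
def rail_voltage(v_source, r_series_ohms, i_load_amps):
    """Voltage left at the load after the series drop (V = I * R)."""
    return v_source - i_load_amps * r_series_ohms

# Hypothetical numbers: 3.3 V source, 0.33-ohm series resistor.
for amps in (0.5, 1.0, 2.0):
    print(f"{amps} A -> {rail_voltage(3.3, 0.33, amps):.2f} V")
```

That's presumably why the actual mod solders a fixed resistor to specific points near the regulator (adjusting its feedback, I'd guess) rather than feeding the whole rail through a dropper.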

Reply 127 of 147, by slivercr

Rank: Member
gmipf wrote on 2021-05-05, 14:30:

I would like to do this mod on my Quadro FX1000. Which value does the resistor have? I'm not good at reading those stripes. And do I have to remove something else beforehand? Do you just bridge from a 3.3V line and drop it with the resistor to 2.48V? I also want to delid the NV30 and use Thermal Grizzly Hydronaut as the thermal paste. I already found and ordered a (somewhat rare?) Zalman VF1000-LED, fully boxed. I also need to figure out which thermally conductive double-sided tape to get for the RAM cooling; the pads included by Zalman aren't that great, I think.

EDIT: On the FX1000, 2 electrolytic caps are also missing which are present on the FX5800(U) and FX2000. Are they the same as the other pink cans? I want to install those missing caps as well. I think they are for stabilizing the GPU voltage?

8.2 kOhm. Nothing to remove, solder to those specific points.

Caps should be the same as the ones next to the empty spots: 1500 uF, Low ESR.
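For anyone else who can't read the stripes: a 4-band 8.2 kOhm part is grey-red-red plus a tolerance band. The standard color-code arithmetic, as a quick sketch:

```python
# Standard 4-band resistor color code: two significant digits and a
# power-of-ten multiplier (the tolerance band is ignored for the value).
DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
          "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}

def resistor_ohms(band1, band2, multiplier):
    """Resistance in ohms from the first three bands."""
    return (DIGITS[band1] * 10 + DIGITS[band2]) * 10 ** DIGITS[multiplier]

print(resistor_ohms("grey", "red", "red"))  # 8200 ohms = 8.2 kOhm
```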

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 128 of 147, by gmipf

Rank: Newbie

Thanks for the info. After considering it again, I came to the conclusion that I don't really have much space for a two-slot config. Can you recommend any single-slot cooling solution for an overclocked FX1000?

Reply 129 of 147, by slivercr

Rank: Member
gmipf wrote on 2021-07-02, 20:48:

...
Can you recommend any single slot cooling solution for an overclocked FX1000?

To overclock to Ultra levels, no. I would not do it without a BIG heatsink and proper RAM cooling.


Reply 130 of 147, by BitWrangler

Rank: l33t++

Have you got overhead room? Because on the 6000s and 7000s there were some heatpipe deallies that had a radiator at 90 degrees overhead, but at the slot they were only 1 slot wide.

Edit: I was thinking of the Asus ones https://www.anandtech.com/show/2073/2 but something similar aftermarket was the Cooler Master CoolViva Z1, though it looks like the heatsink fattens it to 2 slots. However, that could be cut down and a fan put on the radiator, or aimed directly at it if your blowhole is near. There are also a lot of "backpack"-style heatpipe coolers that hook over the card and put the radiator on the backside, where there might not be cards interfering, but they could run into other obstacles, a CPU cooler etc.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 131 of 147, by Blackbird84

Rank: Newbie
slivercr wrote on 2021-07-02, 21:42:
gmipf wrote on 2021-07-02, 20:48:

...
Can you recommend any single slot cooling solution for an overclocked FX1000?

To overclock to Ultra levels, no. I would not do it without a BIG heatsink and proper RAM cooling.

Thank you. I love this thread.

So physically, is the Quadro FX1000 the same as an FX2000 except for the heatsinks?

By the way, did you manage to modify the Quadro FX4400 into a GeForce?

Thank you!!

Reply 132 of 147, by mockingbird

Rank: Oldbie
Blackbird84 wrote on 2022-02-24, 21:53:

Yes and no... Some FX1000 come with K4N26323AE-GC22K (450 MHz)... If yours does, then don't exceed 900 MHz (450 MHz "DDR") with it... Most others come with "GC1K" memory... We don't know exactly what that is, but Samsung specifies "GC20" as 500 MHz, so GC1K is at least capable of that.
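Assuming the Samsung suffix encodes the cycle time in tenths of a nanosecond (which matches the 450 and 500 MHz figures above: GC22 → 2.2 ns, GC20 → 2.0 ns), the rated clock is just the reciprocal. A quick sketch of that arithmetic:

```python
def rated_clock_mhz(grade):
    """Rated clock in MHz from a Samsung speed-grade suffix like 'GC22',
    assuming the digits are the cycle time in tenths of a nanosecond."""
    cycle_ns = int(grade[2:]) / 10.0    # 'GC22' -> 2.2 ns, 'GC20' -> 2.0 ns
    return 1000.0 / cycle_ns            # f = 1 / t_cycle

print(round(rated_clock_mhz("GC22")))   # ~455, marketed as 450 MHz
print(round(rated_clock_mhz("GC20")))   # 500 MHz
```

Note that "GC1K" doesn't fit this numeric pattern at all, consistent with it being an unknown grade.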

Still, I keep my memory at 800 MHz (400 MHz), because GDDR2 wasn't very resilient in the long run.

The only other difference I know of besides the heatsink is the two missing 2.5V 1500uF capacitors, but you can add those as well.

But you will need to re-do the cooling on your card. Here is how I did it:

IMG_1750.JPG
IMG_1751.JPG

This card is used in my Windows 98 "fast" machine (865G AGP machine with a Pentium E5800 @ 3.2 GHz, DirectX 9).

For my Windows 98 "slow" machine I went with a GeForce3, because I want to use older nVidia drivers with it (12.xx) for very old game compatibility (DirectX 6).


Reply 133 of 147, by Blackbird84

Rank: Newbie
mockingbird wrote on 2022-02-25, 01:06:
Blackbird84 wrote on 2022-02-24, 21:53:

Yes and no... Some FX1000 come with K4N26323AE-GC22K (450Mhz)... If yours does, then don't exceed 900Mhz (450Mhz "DDR") with it... Most others come with "GC1K" memory... We don't know exactly what that is, but Samsung specifies "GC20" as 500Mhz, so GC1K is at least capable of that.

Still, I keep my memory at 800Mhz (400Mhz), because GDDR2 wasn't very resilient in the long run.

The only other difference I know besides the heatsink are the two missing 2.5V 1500uF capacitors, but you can add those as well.

But you will need to re-do the cooling on your card. Here is how I did it:

IMG_1750.JPG
IMG_1751.JPG

This card is used in my Windows 98 "fast" machine (865G AGP machine with Pentium E5800 @ 3.2Ghz, DirectX 9).

For my Windows 98 "slow" machine I went with a Geforce3 because I want to use older nVidia drivers with it (12.xx) for very old game compatibility (DirectX 6).

Thank you!!!

The version I bought has GC1K memory chips. By the way, between the FX1000/2000 with GC1K memory and the GeForce FX5800 Ultra, are there any more differences, or are they the same?

Reply 134 of 147, by The Serpent Rider

Rank: l33t++

They are technically the same, but the Ultra probably had higher-binned GPU chips, proven to work at 500 MHz. I think NV30GL (Quadro FX 1000/2000) chips also had some features enabled in silicon which can't be replicated on GeForce (NV30) chips via software modding, should you want to mod an FX5800/5800 Ultra into a Quadro, but that only affects professional OpenGL software. And this thread is about doing it backwards.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 135 of 147, by BitWrangler

Rank: l33t++

I 'member hearing back in the day, with the earlier Quadro/GeForce generations, that Quadro did OpenGL the super-correct, fully-featured way and GeForce did it the quick and dirty way for game speed, and you had to switch between them (there was a jumper/zero-ohm link on the board) and have the right BIOS to be either fast or professional.


Reply 136 of 147, by chrismeyer6

Rank: l33t

I remember hearing about that with Quadro vs GeForce cards back in the day. There were articles on Tom's Hardware, AnandTech, and Guru3D, as well as tons of forum topics about it.

Reply 138 of 147, by slivercr

Rank: Member
avenger_ wrote on 2022-11-07, 10:51:

Thanks, that was very helpful!

I cooked something a bit louder ^^

GQ0801s.jpg

Is that a QFX 1000 with what, an FX 5700 heatsink? All hail the Neo Dustbuster!
EDIT: oh, it's just the stock QFX 1000 heatsink! Are the temps ok? Is it LOUD? 🤣

This is just awesome, you made my day.


Reply 139 of 147, by avenger_

Rank: Newbie

Yes, it's very LOUD 😁 (but only in 3D, just like the original FlowFX - the first semi-passive GPU cooling system 😉).

That's the original FX1000 heatsink (it was glued on, so I decided to leave it), a blower, and a 3D-printed air duct. Temps are OK - ~55°C at 1.4V under stress.

Credits to Gainward for inspiration 😀

m3QzlMd.jpg