VOGONS


The story of the QuForce FX 5800 Ultra...

Reply 140 of 166, by pentiumspeed

Rank l33t
avenger_ wrote on 2022-11-07, 10:51:

Thanks, that was very helpful!

I cooked something a bit louder ^^

GQ0801s.jpg

Cover the GPU heatsink fins partially to keep air routed through the fins longer.

Cheers,

Great Northern aka Canada.

Reply 141 of 166, by BitWrangler

Rank l33t++

You can test your ducting mods with cardboard and masking tape to see what works well. Just use good masking tape, as the cheapy white stuff will curl up and fall off when it gets warm.

Last time I was doing things of that sort, I ended up using galvanised duct sheeting; IDK whether it was because I just had some or because I was trying to be funny, but it worked okay. I had several pairs of tin snips to make working with it easier, though.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 142 of 166, by avenger_

Rank Newbie

Thanks for the suggestions. I'm going to tweak this duct to better fit the outlet on the bracket.
My FX can't do 500/1000 MHz even at 1.5 V; I would probably have to solder on the missing caps and do a GDDR2 vmod. So for now the card is modded to an FX5800 non-Ultra (400/800), and the highest temperature I have seen is 62 °C in Doom 3.

Do you want to hear how loud it is? 😉

Reply 145 of 166, by chrismeyer6

Rank l33t

She's definitely loud, but I'd bet the temps are good.

Reply 146 of 166, by slivercr

Rank Member
avenger_ wrote on 2022-11-13, 23:57:

OK, so prepare your earplugs 😉

https://www.youtube.com/watch?v=InIic3oTu7w

I can't explain why I like this so much 🤣

Nice system, btw. Socket 478 and Rambus go well together.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 147 of 166, by The Serpent Rider

Rank l33t++

OK, so prepare your earplugs

JETSPEED!

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 148 of 166, by acl

Rank Oldbie

Thank you very much for the walkthrough.
I successfully turned my Quadro FX1000 into an FX5800U.

I used a Zalman VF1000 as the cooler (which is compatible) and some RAM heatsinks (+ thermal glue).

The attachment IMG_20240602_213812.jpg is no longer available

The mounting holes are really close to some SMD components, so I used a fiber washer around each hole.
I used electrical tape to insulate the components, because the Zalman screws have metal springs attached and I wanted to prevent any contact with the components/PCB.

The attachment IMG_20240602_215737.jpg is no longer available

I glued on the RAM heatsinks (mostly hidden behind the Zalman cooler).

Final result:

The attachment IMG_20240603_015935.jpg is no longer available
The attachment IMG_20240603_015958.jpg is no longer available
The attachment GPUZ.gif is no longer available

Temperatures mostly stay in the 55-60 °C range under heavy load (HL2 maxed out @ 1920x1200, 16:10).

"Hello, my friend. Stay awhile and listen..."
My collection (not up to date)

Reply 149 of 166, by slivercr

Rank Member

Nice!

Watch the temps on those RAM chips on the back; the stock cooler for a 5800U is huge and gets really hot, and I'm afraid the ones you used may be overwhelmed.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 150 of 166, by acl

Rank Oldbie
slivercr wrote on 2024-06-03, 14:07:

Nice!

Watch the temps on those RAM chips on the back; the stock cooler for a 5800U is huge and gets really hot, and I'm afraid the ones you used may be overwhelmed.

I don't have an infrared thermometer, but during the 3DMark runs and my HL2 test play I touched the back heatsinks several times and they were not that hot (less than 50 °C at a guess, though I'm testing on an open bench and the proximity of the CPU cooler probably helps a bit too). It's nothing near the heat of a Voodoo3, for example, where the heatsink is painful to touch.

I'm still considering adding a small fan on the back, just to be safe (like on some MSI FX5900 cards).

"Hello, my friend. Stay awhile and listen..."
My collection (not up to date)

Reply 151 of 166, by Masterchief79

Rank Newbie

Want to add my FX1000 -> FX5800 Ultra conversion here. First of all, big thanks for the collection of resources and the tutorial; I couldn't have done it otherwise.
My card has GC20 memory, so I was a bit skeptical about reaching Ultra clocks.
I got a new-in-box Zalman VF-900 LED and some copper memory coolers for it. In pic 3 I marked the additional 10 kΩ resistor I added on top of the original 730 Ω one to raise the memory voltage from 2.3 V to 2.5 V.

It turned out that I shouldn't have worried. My card clocked 460 MHz with Quadro timings and a surprisingly high 570 MHz with Ultra timings. So GC20 should still be totally fine for the conversion, as long as you raise the voltage, cool it well and, most importantly, adjust the timings in the BIOS! Pic 5: Quadro timings on the left, Ultra timings on the right.

The attachment AGC_20240706_181937087.jpg is no longer available
The attachment AGC_20240706_181954031.jpg is no longer available
The attachment IMG_20240706_182243459_2.jpg is no longer available
The attachment Timings.JPG is no longer available
The attachment 01 570.JPG is no longer available
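
As an aside on the arithmetic of that piggyback resistor: soldering a 10 kΩ part on top of a 730 Ω one puts the two in parallel. A minimal sketch of the math (the voltages themselves are Masterchief79's figures; how the ~680 Ω result maps onto the 2.3 V -> 2.5 V step depends on the card's VRM feedback network, which isn't derived here):

# Effective resistance of a 10 kOhm resistor piggybacked onto the
# original 730 Ohm one (parallel combination).
r_orig, r_added = 730.0, 10_000.0
r_eff = r_orig * r_added / (r_orig + r_added)
print(f"{r_eff:.1f} Ohm ({100 * (1 - r_eff / r_orig):.1f}% lower)")
# -> 680.3 Ohm (6.8% lower)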

Reply 152 of 166, by e8root

Rank Newbie

Those Zalman coolers look amazing on this card.
I will definitely do the BIOS/resistor mod on my FX1000.
That said, I don't see the point of converting the card from a Quadro FX1000 to a GeForce FX5800 Ultra, because none of the tests I did strapping the GPU with the NVStrap driver (which fools not only Windows but also the driver itself when the PCI strapping option is enabled) showed any difference in performance - not in Direct3D, and not even in OpenGL.

But yeah, I guess it is mostly done to fulfil past dreams of owning an FX5800 Ultra, and I get that. For me it was actually the other way around: I wanted to have Quadro cards, not these pesky cheap gaming GeForce cards 😉

BTW, would just having the GeForce timings be enough to get 500 MHz on the memory, or is the resistor mod absolutely needed?

Reply 153 of 166, by Masterchief79

Rank Newbie

Thank you! FX5800Us are a lot more desirable today, I guess. Performance-wise it doesn't make the slightest difference, as long as you adjust the clocks and timings, obviously.

Whether you need the memory voltage mod depends entirely on the silicon lottery. If your card can do around 450 MHz memory with the original Quadro timings and no mods, there's a good chance you don't need it. Of course, the Quadro cards came with GC1K, GC20 and GC22 chips, so those should clock wildly differently... But not even that is a clear indication, as my card with GC20 clocks higher than any GC1K card I've seen so far.

Also, among other people who have done this mod and binned cards, there is already quite a big variance in memory clockability even when the chips are the same. I've seen 380-460 MHz without mods. So you just have to test it and find out!

Reply 154 of 166, by sunmax

Rank Newbie

I wonder if some of the science in this thread could be applied to this scenario:

- I've got a Quadro FX600 PCI. I'm happy for it to be a Quadro (dual DVI-I, faster GL, etc.), but its VBE implementation really sucks: Quake can only run at VGA resolution or crashes, unable to load the VESA palette; some SVGA DOS games flicker intensely when they start; you need NOLFB (or similar), and then everything tears; and so on. Long story short: the VBE BIOS implementation of this Quadro is flaky.

- I've also got an FX5500 PCI (full specs: 128-bit, RAM at 2 x 200 MHz, etc.) with a solid VBE implementation, but it is single-head (VGA), and its GL performance (e.g. GLQuake) is quite a bit slower than the Quadro's on the same system with the same drivers.

Question: could we modify the BIOS of the FX5500 to turn it into a Quadro BIOS and then flash it onto the FX600? Would we lose one head, since the FX5500 is not dual-head?

I.e. we would proceed as if we were modding the FX5500 into a Quadro FX600, except the modified BIOS would be loaded onto the actual FX600.

Or is there a better way to transplant the FX5500 VBE implementation into the Quadro FX600 BIOS?

Spec-wise, the two cards seem close enough to make it sound possible.

Thanks!

Reply 155 of 166, by Spongeebubu™

Rank Newbie

Is this possible on 6-series cards too? I'm trying to change the device ID on my Quadro FX 4400 to that of a 6800 Ultra, purely for the fun of it.

Reply 156 of 166, by mockingbird

Rank Oldbie
slivercr wrote on 2018-04-04, 17:27:

[NV34] Quadro FX 500/600 (032B) (Not supported by NiBiTor!)

Straps
0050: xx xx xx xx xx xx xx xx 3F D0 30 7E 80 20 C0 80
0060: 00 00 00 00 10 00 00 80 xx xx xx xx xx xx xx xx

...to GeForce FX 5200 Ultra (0321)
nvflash --straps 0x7E10D03F 0x80C00080 0x00000000 0x00000010

...to GeForce FX 5200 (0322)
nvflash --straps 0x7E10C03F 0x80C02080 0x00000000 0x00000010

I am getting "strap value out of range" when running nvflash --straps for the FX5200. Any suggestions, please? I already modded the BIOS with NiBiTor (it changed the device ID to 0322 when I selected FX5200 from the list) and I also flashed it.

You state that NiBiTor won't change the device ID, but looking at the two ROMs, it appears that it does.

Another thing: upon reboot the card does in fact POST, even though the BIOS was flashed and the straps were not changed.

The only other thing I changed was the clocks. But I'll try flashing the modded version back without changing anything besides the device ID and see if that helps.

edit: flashed the original version with only the device ID modded, but got the same error. I also confirmed in a hex editor -- NiBiTor only changed the device ID and checksum. Windows still reports the card as an FX500.
edit 2: I was able to manually change the straps in the BIOS, but it seems the calculation you made was incorrect, because I'm getting device ID 0329 instead of 0322.
edit 3: OK, I think I see my mistake, gonna try again.
edit 4: No, fixing it did not help. True, it was set incorrectly, but I have now set it correctly, and it is displaying as such with the '--display 112' command:

0050: E9 E7 E4 00 DE 10 BA 01 3F C0 10 7E 80 20 C0 80
0060: 00 00 00 00 10 00 00 00 22 00 A5 0C E9 D0 0D E9

but I'm still stuck on device ID 0329. Any ideas?

The attachment FX500.zip is no longer available
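
In case it helps with debugging, here is a minimal sketch (an illustration, not a tool from this thread) that decodes the four strap dwords from a BIOS dump and compares them against the GeForce FX 5200 (0322) values quoted above. It assumes the dump rows are offsets into the ROM image, so the dwords sit little-endian at 0x58/0x5C/0x60/0x64; the filename is a placeholder:

import struct

# Target nvflash --straps values for GeForce FX 5200 (0322), from the quote above.
EXPECTED_0322 = (0x7E10C03F, 0x80C02080, 0x00000000, 0x00000010)

with open("fx500_modded.rom", "rb") as f:  # placeholder filename
    rom = f.read()

# Four little-endian dwords starting at offset 0x58 of the ROM image.
straps = struct.unpack_from("<4I", rom, 0x58)
for i, (got, want) in enumerate(zip(straps, EXPECTED_0322)):
    status = "OK" if got == want else "MISMATCH"
    print(f"dword @0x{0x58 + 4 * i:02X}: 0x{got:08X} (expected 0x{want:08X}) {status}")

Decoded by hand, the dump above already matches all four target values (3F C0 10 7E -> 0x7E10C03F, and so on), so the ROM strap block itself looks correct.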

Reply 157 of 166, by dm-

Rank Member

The device ID on FX5200-5500 and FX500 cards is set by physical strap resistors on the board.

Reply 158 of 166, by mockingbird

Rank Oldbie
dm- wrote on 2025-02-18, 04:24:

The device ID on FX5200-5500 and FX500 cards is set by physical strap resistors on the board.

From the OP's original post:

"So, the gist of it is the straps can be changed either in hardware, by moving resistors around; or in software, by adjusting the AND / OR straps in the BIOS. The approaches are IDENTICAL in result. In my opinion, modifying the software straps has 1 advantage: it can be reversed immediately without having to move resistors back to their original position."

This is not applicable then?
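
For what it's worth, the software straps are usually described as an override applied on top of whatever the board resistors latch at power-on: the latched value is ANDed with one mask, then ORed with the other. A toy sketch of that combining logic (the power-on value is made up, and treating the first quoted dword as the AND mask and the second as the OR mask is my assumption):

# Toy model of a BIOS strap override: mask the latched value, then force bits on.
def apply_override(latched: int, and_mask: int, or_mask: int) -> int:
    return (latched & and_mask) | or_mask

latched = 0x7E30D03F  # made-up power-on strap value
print(f"0x{apply_override(latched, 0x7E10C03F, 0x80C02080):08X}")

If dm- is right that the device-ID bits on these boards come straight from the resistors and ignore the ROM override, that would explain a correct strap block in the dump with an unchanged device ID.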

I did have success with RivaTuner: it can change the card to a "5200LE", but that's no issue, since the actual clocks are what matter (and there is no actual difference in silicon between an FX5200LE and an FX5200).

Reply 159 of 166, by dm-

Rank Member

For older cards you have to move the strap resistors.