Thanks for the compliment! I am working on a Quadro FX 4000 and turning it into a GF 6800 Ultra right now. Besides modifying the straps for the identity, I also have to unlock an extra quad of pipelines 😉 Other cards that should be interesting are the Quadro4 980XGL and maybe the Quadro FX 1100.
Hey, did you have any luck with this? I have a 6800GS and I want to modify the straps to unlock the extra pixel pipelines and vertex shader pipelines through this method instead of using Rivatuner.
Thanks
Never finished.
Most of my cards were sold before I moved across the world and I haven't found any cheap Quadros around where I am now. If I ever get more, I'll play with this again.
As a side note, RivaTuner will yield the exact same result when in Windows. It may be less convenient if you plan to swap cards around and have to set NVStrap every time, but the result is the same.
slivercr wrote on 2020-06-25, 16:33: Never finished. […]
By the way, I bought a Quadro 3000 AGP recently -- it's defective... Corruption on the Windows loading screen and then a system freeze after a few seconds in Windows. OTOH I have three Quadro 1000 cards that work perfectly fine. I think the 5800's core and memory clocks were pushed beyond spec, despite what the official spec claimed (in other words, they were factory overclocked cards). If you can find a fresh Quadro 3000, I suggest running it at Quadro 1000 speeds for longevity. The 2000 and 3000 layouts are also preferable because they accept a much wider variety of cooling solutions. I may try memory IC swaps on it in the future to see if that's what's wrong with it. The NV3x chips should not fail, but I wouldn't be surprised if the Samsung memory has.
As for the 6800GS... besides the 6800GS, I've also got three 6800GT cards. Theoretically, the only things that would differ in the BIOS are the device ID strap and the pixel pipe and vertex shader pipe straps. Maybe I ought to dump both cards and examine the differences... any idea where to look?
Bits 12, 13, 20, and 21 of strap0 control the device ID. No idea for the pipes.
You could dump both BIOSes, compare the bits corresponding to the straps, and use the documentation to rule out which differences matter.
Depending on the GPU, some 6800 GS cards may not unlock at all... those using NV41 and NV42, for instance. Only NV40 and NV45 have a chance of unlocking.
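To make that comparison concrete, here is a minimal sketch in Python of diffing the device-ID bits across two BIOS dumps. The strap0 offset and the filenames are hypothetical placeholders, not known-good values; locate strap0 in your own dump with a hex editor first.

```python
# Minimal sketch: compare the device-ID strap bits (12, 13, 20, 21 of
# strap0, per the post above) between two BIOS dumps.
import struct

STRAP0_OFFSET = 0x58           # hypothetical placeholder, NOT a known-good offset
DEVID_BITS = (12, 13, 20, 21)  # device-ID bits of strap0

def read_strap0(path):
    with open(path, "rb") as f:
        f.seek(STRAP0_OFFSET)
        return struct.unpack("<I", f.read(4))[0]  # one dword, little-endian in ROM

gs = read_strap0("6800gs.rom")  # hypothetical dump filenames
gt = read_strap0("6800gt.rom")
for bit in DEVID_BITS:
    print(f"bit {bit}: GS={(gs >> bit) & 1}  GT={(gt >> bit) & 1}")
```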
RivaTuner takes the guesswork out with a patch script called "NV40BIOSHwUnitsMaskEliminator.rts", which you apply to your BIOS dump. It also shows you in the program what needs to be modified if you want to do it by hand.
All 6800GS AGP cards are NV40. Unfortunately, mine experiences graphical glitching and is not suitable for the mod, so not all 6800GS AGP cards will unlock. I think the score was around 10900 in 3DMark vs. the 6800GT.
Wikipedia's labelling is a bit confusing, but it can be extrapolated that all PCIe chips (which includes AGP cards that use a bridge chip) are TSMC 110nm, while the native AGP parts are IBM 130nm.
Bumpgate and underfill problems used to be a hot topic with nVidia back in the day, but I now wonder if the IBM 130nm chips were in fact much more reliable than the TSMC 110nm parts. I trash-picked a 6600GT yesterday; cleaned her up, oiled the fan, and it works great. It's a factory overclocked model (and it's AGP with the bridge chip), so it might be interesting to compare this card's longevity with the 6800 cards here. It's not an apples-to-apples comparison though, because the chip is different. A good test would be to get a native PCIe 6800 card and compare that, because it's much more similar.
To be fair, all previous chips were TSMC too, but the problems did in fact all start with the 90nm TSMC chipsets (as in the motherboard chipsets, which had an enormous failure rate - though it's also worth noting that HP overvolted them from the factory).
EDIT: Several hours into a 3DMark 2003 loop, the 6600GT crashed the system, which lends credence to my theory that IBM's 130nm 6th-generation GeForce parts were more reliable than the TSMC-fabbed ICs.
But alas, we cannot use this command on GF4 cards, since nvflash is dumb and cannot handle straps that don't start with 7F (all GF4 cards use straps that start with FF).
The nvflash --straps command will not work, so you will have to edit your BIOS straps manually.
So hop into your favourite hex editor (I used RVB Edit / X-BIOS Editor) and write the new mask in, minding the byte order:
1 FF EF FF FF 00 00 00 00 FF FF FF FF 00 00 00 00
Then save and flash your new hacked BIOS onto the card.
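For anyone who would rather script the edit than poke around in a hex editor, here is a minimal sketch of the same patch in Python. The offset is a hypothetical placeholder, and I am reading the leading "1" in the mask line above as a list index rather than data; verify both against your own dump before flashing.

```python
# Sketch of the manual GF4 strap edit described above.
STRAP_OFFSET = 0x58  # hypothetical placeholder: find the FF-prefixed strap block yourself
NEW_MASK = bytes.fromhex("FFEFFFFF00000000FFFFFFFF00000000")  # 16 mask bytes from the post

rom = bytearray(open("gf4_dump.rom", "rb").read())  # hypothetical filename
rom[STRAP_OFFSET:STRAP_OFFSET + len(NEW_MASK)] = NEW_MASK
open("gf4_patched.rom", "wb").write(rom)

# Note: PCI option ROMs must checksum to zero (the image bytes sum to 0 mod
# 256); if your flashing tool does not recompute the checksum, fix it first.
```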
And since I didn't do this on my own card, but rather helped someone else do it, I'll leave it up to them to show the results 😀
Here's a quick proof
On the weekend a new cooler arrived and I've been trying it on. I decided to "downgrade" the card to a GeForce FX 5800 (still an upgrade from the Quadro FX 1000) in order to keep it as a single slot card and avoid heat issues with the VRAM and the RDRAM so close to each other.
(photos attached: IMG_20180502_140956.jpg, IMG_20180502_141023.jpg)
The cooler does a great job and is very quiet. I suspect my success with such small coolers has to do with the CPU used in my machine; with a more powerful CPU, the card would be able to stretch its legs a bit more and would get way hotter (I see this effect now when benchmarking a Coppermine 1000 vs. a Tualatin 1400).
This cooler... I think I have the exact same one somewhere. In my case it originally came on an 8600GT DDR2 from ECS, and I remember it being competent at cooling that card even with an OC (I think it was to 610MHz).
I'm thinking, if it can cool an "FX 5800", surely it can cool my FX5900SE (400MHz)? That card (from EVGA) also has a noisy fan, and the worst part is that it doesn't offer any temp readings. The fan runs really loud (100%) on startup, then slows down during the boot process, and with some driver/OS combinations it only boosts the speed when running 3D. Unfortunately, on 98 with 45.23 it's stuck at 100% all the time (and AIDA reports 100% fan speed and something like 4500 RPM). I don't think it's something that can be software controlled.
I wonder if you are still running this combination (Quadro + cooler)?
Actually, a little OT, but this also reminds me: around 2004 I had another 5900SE from EVGA (the exact same model as the current one), and at some point I downloaded a modded 5950U BIOS for it from a forum (Guru3D, I think). I remember it worked well at 475MHz (the BIOS bumped the vcore), and I think I could run the memory at 900 thanks to looser timings (impossible with the stock BIOS). This thread reminds me of that. There is another detail I find interesting, though my memory is not 100% on this: with the stock BIOS it didn't display a temp sensor, but with the modded 5950U BIOS it actually worked...
edit> RivaTuner can control the fan speed just fine! Might be the same with the Quadro; it's just that in my case, without a temp sensor, it seems risky to play with it.
It's (apparently) an OC'ed 4200 Ti. Default clocks: 275MHz core, 600MHz memory.
Because it looks like a 4800 Ti PCB, I figured it should be possible to mod it.
Using the previously mentioned method, we successfully exchanged the IDs of this card 😀
Here's how it was recognised (after flashing, before restarting):
(attachment "Name before.png" no longer available)
Performance-wise, it doesn't matter whether you have a 4200 Ti "8x" overclocked to 4600 Ti speeds or an actual 4800 Ti.
(attachment "3DMark 01.png" no longer available)
vs.
(attachment "3DMark 01.png" no longer available)
The only difference between those is the DeviceID (which can be seen in GPU-Z).
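As a hypothetical illustration of what "exchanging IDs" amounts to at the bit level, using the strap0 device-ID bits mentioned earlier in the thread (the dword values below are made up purely for illustration):

```python
# Copy the device-ID strap bits (12, 13, 20, 21) from a donor strap0 into
# our own strap0, leaving every other bit untouched.
DEVID_MASK = (1 << 12) | (1 << 13) | (1 << 20) | (1 << 21)

def swap_devid_bits(own_strap0, donor_strap0):
    return (own_strap0 & ~DEVID_MASK) | (donor_strap0 & DEVID_MASK)

# Made-up example values, purely for illustration:
print(hex(swap_devid_bits(0xFFFFEFFF, 0xFFCFFFFF)))
```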
Modding GeForce FX5800 to FX5800 Ultra (this really is only meant for those of you who want to test your golden-chip 5800s)
I like to fully document the process for archival/long term reference.
Here are some test results, contrasting the performance differences between when the card is run as a Quadro, and when it is run as a GeForce.
The tests were conducted with the 56.64 drivers. 45.23 would have been preferable, but the 5700 series cards do not work with 45.23, so the 56.64 drivers were necessary in order to obtain an apples-to-apples comparison. Otherwise, outside of benchmarking, I would avoid the FX 1100/5700 series cards altogether for this reason alone, since many older games require driver 45.23 or earlier (and in some cases a TI4600 is preferable for its very early driver compatibility, which a further selection of games requires).
The benchmark software in this case was Quake 3, updated to 1.32C with all details set to max and the resolution at 1600x1200x32. The nVidia driver preferences were set to default.
Changing from Quadro to GeForce and vice versa was very conveniently done with RivaTuner 2.24.
GeForce FX 5800 - 210.6 fps (400/800)
Quadro FX 1000 - 193.9 fps (400/800)
GeForce FX 5700 - 163.5 fps (425/650)
Quadro FX 1100 - 135.7 fps (425/650)
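For reference, the Quadro-mode performance hit works out as follows (plain arithmetic on the scores above):

```python
# Quadro-mode performance hit, computed from the Quake 3 results above.
scores = {
    "FX 5800 / Quadro FX 1000": (210.6, 193.9),
    "FX 5700 / Quadro FX 1100": (163.5, 135.7),
}
for pair, (geforce, quadro) in scores.items():
    print(f"{pair}: Quadro mode is {(1 - quadro / geforce) * 100:.1f}% slower")
# -> roughly 7.9% slower for the 5800 pair, 17.0% slower for the 5700 pair
```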
1) The Quadro performance hit only seems to affect the GeForce FX cards and not GeForce 4.
2) The TI4600 is not nearly as fast as the GeForce 5800, which is contrary to popular belief. Rather, it sits closer to the GeForce 5700 performance-wise.
3) The GeForce FX 5700LE is even slower than a previous generation GeForce4 MX440, so it should be avoided.
4) The GeForce 4 MX440 is still a widely available, great budget choice for an old build. Performance can be further increased by dropping to 16-bit color depth (it is worth noting that this performance increase was not possible with Radeon cards because of the architecture).
I didn't expect #2 to be a thing, let alone popular belief.
Fully agree with you on the mx440s, they are great for P3 systems.
Also, maybe edit your post with the specs of the testbench, just so the post is useful as a reference point for other people.
A standard 5900 is very similar in performance to a Ti4600 in DirectX 8 and OpenGL titles, if I remember correctly. The only time the 5900 stretches its legs is with AA applied. The FX5900 Ultras are a bit faster, but still not by a huge gap. DirectX 9 would be an entirely different story, though.
The 5800/5900 has better raw performance than the TI4600, AA or no AA...
Fully agreed on the MX440s being great for P3 systems, with the stipulation that it's a 128-bit MX440 and not a 64-bit MX440 or an MX440SE.
As for the testbench specs: the original idea was to run them on a BX/Tualatin 1.4 system, but the CPU limited the GPUs and they all maxed out at 150FPS or so... A high-clock Athlon XP or P4 system would probably also do the trick, but in this case the tests were run on an 865/E6420 (2.13GHz) platform.
That's not to say that a 5800 or better is wasted on a Tualatin build... on the contrary, if AA and other fancy features are enabled, the card will very much be pushed to its limit, regardless of the CPU.
Just popped onto eBay to see what the prices are for a Quadro vs. a GeForce 5800U these days.
It's amusing that literally one bit (the different strap needed to change the identity) makes an order of magnitude difference in price.
I was lucky enough to get an FX3000 in good condition.
I am testing it extensively, and I get around 6000 points in 3DMark01 with a single P-III 700MHz (100MHz FSB) on a P2B-D.
Is it worth changing it to a GeForce?
It's really up to you to decide if it's "worth it". I suggest using RivaTuner to change the ID, then testing the games you want to play and seeing for yourself.
I strongly recommend underclocking the memory of the FX3000 to at least 200MHz below spec. The DDR on the FX3000 is notorious for not lasting long at stock clocks. This does not apply to the Quadro 1000 and 2000, which used vastly improved DDR2.
I would advise doing this via a BIOS mod rather than through software, because otherwise the card will run at stock clocks until the software applies the changes, well after the boot process completes.