VOGONS


The story of the QuForce FX 5800 Ultra...


Reply 80 of 106, by slivercr

Rank Member
agent_x007 wrote:

Can it be done to my Quadro FX 1300?

I'm interested in PCX 5950 😀

I'll look into it later today, pretty sure it can be done. Can you get anywhere near the PCX 5950 clocks, though?

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 81 of 106, by The Serpent Rider

Rank l33t

Quadro FX 1300 has the standard 2.8 ns memory of a 5900XT.

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 82 of 106, by agent_x007

Rank Oldbie

So a PCX 5900 with OC then 😀
I can change the clocks myself (I haven't tested the OC capabilities of my FX 1300 yet).

However, I assume it shouldn't be too hard on the GPU side, because the PCX 5950 uses the same cooler.

108080818886.png

Reply 83 of 106, by The Serpent Rider

Rank l33t

So a PCX 5900 with OC then

Not really. The GeForce PCX 5900/5950 are clocked at exactly the same speed as the Quadro FX 1300. I'm not even sure there were OC versions, because nobody really cared about this card.

I haven't tested the OC capabilities of my FX 1300 yet

Pretty much everything you can expect from a 5900XT card. Stock cooler is kinda lacking though.

Last edited by The Serpent Rider on 2020-06-25, 15:44. Edited 1 time in total.

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 84 of 106, by candle_86

Rank l33t

One more if you could, man.

It's an NV41 6800GS from an FX3450 (I have 3 of these things 🤣)

Attachments

Phenom II X4 840T @ 4ghz - ASUS M3N72D-SLI - GTX 560 Ti- 4GB DDR2 1066 - 1TB HDD - Windows XP
Pentium 4 3.4C - MSI 865PE NEO2 - x850 XT PE - 2GB DDR 400 - 500GB HDD - Windows XP
Duron 1600 - ASUS A7N8X - 512MB DDR 266 - Radeon 8500 LE

Reply 85 of 106, by slivercr

Rank Member
The Serpent Rider wrote:

Quadro FX 1300 has the standard 2.8 ns memory of a 5900XT.

This by itself doesn't say much. It may be severely undervolted, as was the case with my Quadro FX 1000.

The Serpent Rider wrote:
Not really. GeForce PCX 5900/5950 are clocked at exactly same speed as Quadro FX 1300. I am not even sure if there were OC versi […]

So a PCX 5900 with OC then

Not really. The GeForce PCX 5900/5950 are clocked at exactly the same speed as the Quadro FX 1300. I'm not even sure there were OC versions, because nobody really cared about this card.

I haven't tested the OC capabilities of my FX 1300 yet

Pretty much everything you can expect from a 5900XT card. Stock cooler is kinda lacking though.

I don't think you are correct about the PCX 5900 and the QFX 1300 being clocked the same. You can look at the picture that was uploaded (or at the BIOS), the memory is heavily underclocked at 275 MHz, compared to 425 MHz on the PCX; and the core is not even up to PCX 5900 frequencies. They can probably both get there, but I have a gut feeling the memory won't play nice until its voltage is tweaked.

@agent_x007 @candle_86 : I'll work on the straps tomorrow, probably. Just watched Deadpool2 and now I'm watching the Champions League final, so...

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 86 of 106, by The Serpent Rider

Rank l33t

425 MHz on the PCX

That's impossible, because they're both using the same 700 MHz rated memory (2.8 ns). 850 MHz is way out of spec.

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 87 of 106, by slivercr

Rank Member
The Serpent Rider wrote:

425 MHz on the PCX

That's impossible, because they're both using the same 700 MHz rated memory (2.8 ns). 850 MHz is way out of spec.

I just checked other sources besides Wikipedia; my bad: the PCX 5900 and FX 1300 seem identical.

I put too much trust in this list.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 88 of 106, by candle_86

Rank l33t

Hey man, any luck on the FX3450?

Phenom II X4 840T @ 4ghz - ASUS M3N72D-SLI - GTX 560 Ti- 4GB DDR2 1066 - 1TB HDD - Windows XP
Pentium 4 3.4C - MSI 865PE NEO2 - x850 XT PE - 2GB DDR 400 - 500GB HDD - Windows XP
Duron 1600 - ASUS A7N8X - 512MB DDR 266 - Radeon 8500 LE

Reply 89 of 106, by mockingbird

Rank Member

I have a question:

I did this mod on a Quadro FX1000 AGP.

I ran some 3D apps but the clock stayed at 300/300 (verified by logging with GPU-Z).

I manually went to the nVidia control panel (driver 45.23), and set the clock for 2D to 500/1000 and I immediately got artifacting and a hard lock.

1) Why won't my card ramp up to 3D clocks when 3D apps are running (tried 3DMark 2003 for example)
2) Does the artifacting and crash at 500/1000 mean my card is defective?

I am running Windows XP.

Thanks


Reply 91 of 106, by mockingbird

Rank Member
Logistics wrote on 2020-05-21, 04:58:

The review I saw had a 5800U running over AGP 8x. Is it your motherboard or the 1000 that wants to run at 4x?

Hi, thanks. I have it running at AGP 8x and SBA is enabled. I must stress that I'm trying to mod this into a 5800, non-ultra.

According to an old online article, the K4N26323AE-GC1K is 1100 MHz capable. The only extant datasheet I can find for this chip states the following:

K4N26323AE-GC20 - 500MHz
K4N26323AE-GC22 - 450MHz
K4N26323AE-GC25 - 400MHz

...which is 1000, 900, and 800 compared to the GC1K part on the Quadro FX1000. I'm almost certain my card isn't faulty, because I have a few of them and they all exhibit the same exact behavior. When raising the memory frequency above a certain point, it crashes the screen with artifacts.

The OP mentioned tinkering with voltages and frequencies in NiBiTor, but I find the program very confusing. I'm attaching both my modded and strapped Quadro FX1000 BIOS as well as the Abit Siluro 5800 BIOS, and I'd appreciate it if someone could make heads or tails of the differences between them for me. Maybe the issue is that a hardware voltage mod is necessary. Again, if someone could shed some light on the matter, it would be appreciated.

EDIT: Here's another oddity. I mentioned that the clock wasn't ramping up when a 3D program was running... But I do know now that that's not driver related. For one, there are 5800 benchmarks posted on this forum with driver 45.23. Second, when I set the 2D clock to 500/800 (I'm only setting the RAM to 800 because I'm not sure how much past 800 it can go without crashing before figuring that problem out), GPU-Z will show that the clock is set successfully to the new values, but as soon as I launch a windowed 3D app, the clock drops down to 300/300, no matter what the clock setting for 3D is.

EDIT 2: Never mind, this had to do with bypassing nVidia's test check before allowing those clocks, even though I had it disabled in the registry. Time to run 3DMark 2003 again to see the new score.

EDIT 3: Something's not right. GPU-Z is logging shifts to 300/300 while 3DMark is running. I will try to modify the BIOS to 500/800 manually with NiBiTor. My only concern is the lack of a 1.4V option in the Quadro BIOS.

EDIT 4: So modding the BIOS to 500/800 _and_ using overclocking with coolbits kept it at 500/800 during 3DMark 2003. But I experienced instability. The system crashed during the test. My PSU is adequate and has fresh Japanese caps, but maybe it's a lousy design. So I'll try something else to rule it out, and then I'll start dropping the core speed to see if that's the problem (which would confirm that the 1.4V setting which is absent in the FX1000 BIOS is necessary for 500MHz).

Attachments


Reply 92 of 106, by slivercr

Rank Member
mockingbird wrote on 2020-05-20, 02:54:
I have a question: […]

I have a question:

I did this mod on a Quadro FX1000 AGP.

I ran some 3D apps but the clock stayed at 300/300 (verified by logging with GPU-Z).

I manually went to the nVidia control panel (driver 45.23), and set the clock for 2D to 500/1000 and I immediately got artifacting and a hard lock.

1) Why won't my card ramp up to 3D clocks when 3D apps are running (tried 3DMark 2003 for example)
2) Does the artifacting and crash at 500/1000 mean my card is defective?

I am running Windows XP.

Thanks

Hi! Glad to see more people playing with Quadros

1) If you followed the steps of the TL;DR, then you just changed the card's identity. You still need to mod the BIOS with new frequencies.
2) 500/1000 are 5800u frequencies—is your card properly cooled? With my Quadro FX 1000 I actually did a small hardware mod to provide more power to the memory (there's a picture somewhere in the thread). My guess is a combination of heat and lack of power caused your card to artifact and crash.

Try some more conservative frequencies first and install better cooling. Then you can go for the Ultra 😉
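As an aside for anyone editing frequencies in a BIOS dump with a plain hex editor rather than NiBiTor: a PCI expansion ROM must sum to zero mod 256, so a spare byte has to be adjusted after any edit or the image is invalid. The sketch below follows the generic PCI option-ROM layout; treating the last byte of the image as the checksum byte is an assumption, not something stated in this thread.

```python
# Hypothetical sketch: fix the PCI expansion-ROM checksum after hex-editing
# a video BIOS dump. Per the PCI spec, every byte of the ROM image must sum
# to 0 mod 256; here the last byte of the image is used to make that true.
# The checksum-byte location is an assumption, not an FX-specific fact.

def fix_rom_checksum(rom: bytes) -> bytearray:
    rom = bytearray(rom)
    assert rom[0] == 0x55 and rom[1] == 0xAA, "not a PCI option ROM"
    size = rom[2] * 512          # byte 2 = image length in 512-byte blocks
    rom[size - 1] = 0            # clear the old checksum byte
    rom[size - 1] = (-sum(rom[:size])) & 0xFF   # make the image sum to 0
    return rom
```

Running any strap or clock edit through a helper like this before flashing avoids the card rejecting the image outright.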

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 93 of 106, by mockingbird

Rank Member
slivercr wrote on 2020-05-22, 06:38:
Hi! Glad to see more people playing with Quadros […]

Hi! Glad to see more people playing with Quadros

1) If you followed the steps of the TL;DR, then you just changed the card's identity. You still need to mod the BIOS with new frequencies.
2) 500/1000 are 5800u frequencies—is your card properly cooled? With my Quadro FX 1000 I actually did a small hardware mod to provide more power to the memory (there's a picture somewhere in the thread). My guess is a combination of heat and lack of power caused your card to artifact and crash.

Try some more conservative frequencies first and install better cooling. Then you can go for the Ultra 😉

Thanks. Yes, you were correct. For some reason I erroneously thought that 5800 stock was 500/1000, when it's supposed to be 400/800.

At 400/800 the test ran perfectly fine:

Attachments: 3dmark03.png (15.57 KiB, public domain), 3dmark03_2.png (16.41 KiB, public domain)

The GPU-Z log reported that the clocks ran steadily at 400/800 throughout the test and that the maximum temperatures were 61 °C for the core and 50 °C for the PCB. Idle core temp is 39 °C at these clocks. That seems OK for the stock cooler. The test system is a Core 2 E4600 @ 2.4 GHz with 2 GB RAM. Are my results OK?
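Checking a whole GPU-Z run by eye is tedious; a short script can scan the sensor log and report the clock range. This is a sketch assuming a CSV-style log; the column names below are from my own (assumed) GPU-Z version, so adjust them to whatever header your log file actually uses.

```python
# Sketch: scan a GPU-Z sensor log (CSV) and report the min/max of the core
# and memory clock columns, to confirm the card held its clocks during a run.
# Column header names vary across GPU-Z versions -- check your own log file.
import csv

def clock_range(path, core_col="GPU Core Clock [MHz]",
                mem_col="Memory Clock [MHz]"):
    cores, mems = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            try:
                cores.append(float(row[core_col]))
                mems.append(float(row[mem_col]))
            except (KeyError, ValueError):
                continue                     # skip malformed lines
    return (min(cores), max(cores)), (min(mems), max(mems))
```

If the min and max match the BIOS clocks, the card never throttled back to 300/300 mid-test.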

In the picture you referenced, it shows an 8.2k resistor between TP503 and TP504, is this correct? That will bring the memory voltage up to 2.5V? That seems like a rather simple and effective mod that won't stress anything... These memory ICs are supposed to be capable of 1100MHz anyway.

Next up is the installation of my Zalman ZM80D-HP as a quiet alternative to the somewhat loud stock cooler.


Reply 94 of 106, by pentiumspeed

Rank Oldbie

I have a chart from one of my Google searches on frequency vs. ns; it applies only to DDR VRAM, not other kinds of DDR. I confirmed it against the datasheet for one of the memory chips on my video card when I was looking into overclocking a Quadro, which turned out to be a no-go. Exactly what you have discovered, unless some Quadro cards have memory with a lower ns rating, in which case you could get away with overclocking.

DDR Clock   Real Clock   Clock Period
  200 MHz     100 MHz    10 ns
  266 MHz     133 MHz    7.5 ns
  333 MHz     166 MHz    6 ns
  400 MHz     200 MHz    5 ns
  533 MHz     266 MHz    3.75 ns
  666 MHz     333 MHz    3 ns
  800 MHz     400 MHz    2.5 ns
1,066 MHz     533 MHz    1.875 ns
1,333 MHz     666 MHz    1.5 ns
1,600 MHz     800 MHz    1.25 ns

Great Northern aka Canada.

Reply 95 of 106, by slivercr

Rank Member
mockingbird wrote on 2020-05-22, 18:02:

Thanks. Yes, you were correct. For some reason I erroneously thought that 5800 stock was 500/1000, when it's supposed to be 400/800.

Glad it's working properly!

mockingbird wrote on 2020-05-22, 18:02:

In the picture you referenced, it shows an 8.2k resistor between TP503 and TP504, is this correct? That will bring the memory voltage up to 2.5V? That seems like a rather simple and effective mod that won't stress anything... These memory ICs are supposed to be capable of 1100MHz anyway.

Next up is the installation of my Zalman ZM80D-HP as a quiet alternative to the somewhat loud stock cooler.

Yeah, it's a classic vmod: create a voltage divider on the feedback, fooling the chip into thinking it's providing less power than it should, so it puts out more. My only real contribution was that I probed all around the card and found those awesome test points that are super easy to solder to for the mod. Try to use 5% tolerance or less.
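The arithmetic behind that feedback trick: the regulator drives its output until the divider midpoint equals its internal reference, so Vout = Vref * (1 + R_top / R_bottom); putting the mod resistor in parallel with the bottom leg shrinks it, and the regulator raises Vout to compensate. In the sketch below, Vref, R_top and R_bottom are placeholder values, not measurements from an FX 1000; only the 8.2k mod resistor comes from the thread.

```python
# Back-of-envelope for the feedback vmod. All regulator values here are
# ASSUMED placeholders -- only the 8.2k mod resistor is from the thread.

def vout(vref, r_top, r_bottom):
    # Regulator output for a feedback divider referenced to vref
    return vref * (1.0 + r_top / r_bottom)

def parallel(r1, r2):
    # Equivalent resistance of two resistors in parallel
    return r1 * r2 / (r1 + r2)

VREF, R_TOP, R_BOTTOM, R_MOD = 0.8, 3000.0, 1800.0, 8200.0
stock  = vout(VREF, R_TOP, R_BOTTOM)
modded = vout(VREF, R_TOP, parallel(R_BOTTOM, R_MOD))
print(f"stock {stock:.2f} V -> modded {modded:.2f} V")
```

The tolerance advice matters because the output shift scales directly with the divider ratio: a sloppy mod resistor moves Vout by a correspondingly sloppy amount.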

pentiumspeed wrote on 2020-05-23, 01:26:
I have a chart from one of my googles on frequency vs ns, applies only to DDR vram not other DDR. Up to too which , I confirme […]

I have a chart from one of my Google searches on frequency vs. ns; it applies only to DDR VRAM, not other kinds of DDR. I confirmed it against the datasheet for one of the memory chips on my video card when I was looking into overclocking a Quadro, which turned out to be a no-go. Exactly what you have discovered, unless some Quadro cards have memory with a lower ns rating, in which case you could get away with overclocking.

DDR Clock   Real Clock   Clock Period
  200 MHz     100 MHz    10 ns
  266 MHz     133 MHz    7.5 ns
  333 MHz     166 MHz    6 ns
  400 MHz     200 MHz    5 ns
  533 MHz     266 MHz    3.75 ns
  666 MHz     333 MHz    3 ns
  800 MHz     400 MHz    2.5 ns
1,066 MHz     533 MHz    1.875 ns
1,333 MHz     666 MHz    1.5 ns
1,600 MHz     800 MHz    1.25 ns

I'm sorry, but I don't understand the general point you're trying to make.
The table is pretty general though: 100 MHz always equates to a period of 10 ns and vice versa, f = 1/T.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 96 of 106, by pentiumspeed

Rank Oldbie

I cross-referenced one of the GPU DDR chip datasheet PDFs and it matches what the chart shows. That's up to the max clock according to the datasheet, not beyond. Manufacturers often run these memory chips with a measure of margin and slightly relaxed timings for reliability. Overclockers tighten up the timings and overclock them, but reliability is not guaranteed. What I experienced trying to push stock parts matches this, especially the rare occasional crashes/hiccups and data corruption. The only exception I could find was exceptionally binned memory modules and the best chips (RAM, CPUs, high-end GPUs made for this, and so on), which cost too much for a little more speed.

Cheers,

Great Northern aka Canada.

Reply 97 of 106, by slivercr

Rank Member
pentiumspeed wrote on 2020-05-24, 00:41:

Manufacturers often run these memory chips with a measure of margin and slightly relaxed timings for reliability. Overclockers tighten up the timings and overclock them, but reliability is not guaranteed. What I experienced trying to push stock parts matches this, especially the rare occasional crashes/hiccups and data corruption. The only exception I could find was exceptionally binned memory modules and the best chips (RAM, CPUs, high-end GPUs made for this, and so on), which cost too much for a little more speed.

Cheers,

Agree with everything.

The caveat here is that the datasheet for the particular memory this card uses is not available anywhere (to my knowledge), so we're relying on old reviews of the GF FX 5800U that use the same chips (the reviews claim the chips go up to 1100 MHz, which I doubt), datasheets for similar chips, and experimentation.

The datasheet for the K4N26323AE mentions frequencies of 400 MHz (part no. GC25), 450 MHz (GC22) and 500 MHz (GC20). The part number on the chip corresponds to the period: GC22 = 2.2 ns = 450 MHz. My personal theory is that the chips we're dealing with, part no. GC1K, don't list the period but the DDR frequency: 1K = 1000 DDR = 500 MHz = 2.0 ns = GC20. At least that was my reasoning to convince myself to increase the frequency while experimenting!

To sum it up: the idea behind the mods—besides changing the card's identity—is we're not overclocking the chips, we're running them to spec!
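That speed-grade reasoning can be sketched in code. The GC25/GC22/GC20 decoding follows the datasheet (suffix digits are the clock period in tenths of a ns); the GC1K branch encodes the thread's hypothesis that 1K means a 1000 MT/s DDR rate, which is not datasheet fact. Note the nominal ratings round down: 1/2.2 ns is about 454 MHz, but GC22 is sold as 450 MHz.

```python
# Decode K4N26323AE speed-grade suffixes to a real (non-DDR) clock in MHz.
# GC25/GC22/GC20 follow the datasheet convention (period in tenths of ns);
# the "1K" branch is the thread's HYPOTHESIS (1K = 1000 MT/s DDR), not fact.

def grade_to_real_mhz(suffix: str) -> float:
    digits = suffix.removeprefix("GC")
    if digits.endswith("K"):                  # "1K" -> 1000 MT/s (DDR)
        return float(digits[:-1]) * 1000 / 2  # DDR rate -> real clock
    period_ns = float(digits) / 10            # "25" -> 2.5 ns
    return 1000.0 / period_ns                 # f = 1/T

for s in ("GC25", "GC22", "GC20", "GC1K"):
    print(s, "->", round(grade_to_real_mhz(s)), "MHz")
```

Under this reading, GC1K and GC20 land on the same 500 MHz real clock, which is exactly the "running them to spec" argument above.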

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 98 of 106, by mockingbird

Rank Member

Just ran 3DMark03 on my 6800GT... It's a much faster card, but the 5800 is still king for DirectX 8.0 and below games because of its compatibility with the 45.23 drivers.

Attachments: 6800gt.png (15.53 KiB, public domain), 6800gt2.png (16.52 KiB, public domain)


Reply 99 of 106, by mockingbird

Rank Member
slivercr wrote on 2018-04-06, 09:32:

Thanks for the compliment! I am working on a Quadro FX 4000 and turning it into a GF 6800 Ultra right now. Besides modifying the straps for the identity, I also have to unlock an extra quad of pipelines 😉 Other cards that should be interesting are the Quadro4 980XGL and maybe the Quadro FX 1100.

Hey, did you have any luck with this? I have a 6800GS and I want to modify the straps to unlock the extra pixel pipelines and vertex shader pipelines through this method instead of using RivaTuner.

Thanks
