VOGONS


First post, by rick6

Rank: Member

I can almost guess what most of you would answer, but I'll ask anyway:

Would sawing a "new" notch on this GeForce 6800 (dual molex) Asus V9999 in order to fit it in an AGP 3.3V slot kill it when I turn the PC on?

b1796.jpg

I know this question sounds pretty dumb, but I believe there were some versions of the NVIDIA 68xx generation that were AGP 3.3V compatible, and since this card doesn't have any copper contacts in that area, I almost feel like trying it.
I also know that the notches are a safety measure to avoid burning your video card or motherboard (or both) in case of voltage incompatibility, but maybe this could work somehow.

Any ideas?

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 3 of 23, by rick6

Rank: Member

Yeah, I understand; the chances of this working are close to none and the magic smoke could be released. It's not worth the risk of damaging this card, as it's a nice AGP card and these are getting rarer (or so I think).

I just thought it could work because I believe I've already seen a few GeForce 68xx cards that were AGP 3.3V compatible, so there could be a slim chance. There aren't even contacts in that notch area!
There was a hardware modder named ciacara who managed to mod a Voodoo 5 AGP to work in an AGP 1.5V slot by adding a new notch, but yeah, that sounds safer than trying a 1.5V card in a 3.3V AGP slot.

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 4 of 23, by Old Thrashbarg

Rank: Oldbie

I just thought it could work because I believe I've already seen a few GeForce 68xx cards that were AGP 3.3V compatible

I've heard that there were 3.3V compatible 6800 cards, but the more I've looked into it, the less I believe it. Adding to the confusion (or perhaps the actual source of it) is the fact that a couple manufacturers released cards that were keyed wrong... they had the notches to fit in 3.3V slots, but they were not actually 3.3V compatible.

Reply 6 of 23, by nforce4max

Rank: l33t

It is not worth the risk of blue-smoking this one, and the 6800 series wasn't exactly known for 3.3V compatibility. Some of the 6200s and 6600s are 3.3V compatible, but they are getting harder to find and don't have the greatest specs. Don't plug any card into a 3.3V slot until you've found someone who has used that exact card successfully; only go for a sure thing. The card and the board might be cheap to replace now, but in the years to come they will be very expensive, and you don't want the loss of something so rare on your mind.

On a far away planet reading your posts in the year 10,191.

Reply 7 of 23, by j^aws

Rank: Oldbie
rick6 wrote:
I can almost guess what most of you would answer, but I'll ask anyway:

Would sawing a "new" notch on this GeForce 6800 (dual molex) Asus V9999 in order to fit it in an AGP 3.3V slot kill it when I turn the PC on?

b1796.jpg

I know this question sounds pretty dumb, but I believe there were some versions of the NVIDIA 68xx generation that were AGP 3.3V compatible, and since this card doesn't have any copper contacts in that area, I almost feel like trying it.
I also know that the notches are a safety measure to avoid burning your video card or motherboard (or both) in case of voltage incompatibility, but maybe this could work somehow.

Any ideas?

Sawing the notches works. I have done this to GeForce 6800GT and GE variants and installed them both in a 440BX mobo - I currently have the GT installed, alongside a V5500 PCI. You need to be careful with the saw - I just used a fine-grade file. A mobo BIOS update may be required to install Win98SE drivers (I needed one). YMMV.

Reply 8 of 23, by rick6

Rank: Member

Hi j^aws, that's really good news, but I'm still a bit afraid, as this is a somewhat different variant of the NVIDIA 6800 cards. As you can see it is a dual-molex card, but it isn't (at least according to the BIOS) a 6800 Ultra card. I'm not sure if this even has the same PCB as the 6800 Ultra, but do you think this would work with this particular card? I wouldn't blame anyone of course, but I would be mad at myself if it killed the card 😮

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 9 of 23, by j^aws

Rank: Oldbie
rick6 wrote:

Hi j^aws, that's really good news, but I'm still a bit afraid, as this is a somewhat different variant of the NVIDIA 6800 cards. As you can see it is a dual-molex card, but it isn't (at least according to the BIOS) a 6800 Ultra card. I'm not sure if this even has the same PCB as the 6800 Ultra, but do you think this would work with this particular card? I wouldn't blame anyone of course, but I would be mad at myself if it killed the card 😮

Well, I just checked both my 6800GT and 6800GE models, and they both have dual molex connectors, both are Asus cards and look identical to the pictured card above. Both cards are designed to be overclocked to Ultra speeds. The GE variant has 5 VS and 12 PS (shaders can be unlocked), whilst the GT variant has 6 VS and 16 PS (same as Ultra). It looks like these cards share the same PCB and cooling system.

As they say: "Don't try this at home, unless you know what you're doing, and you do so at your own risk..."

If it helps, I wasn't sure if the mod was going to work at first, but I was prepared to lose the mobo and card, and when it worked, it was a nice bonus!

Reply 11 of 23, by rick6

Rank: Member

OK, I'm going to try this, probably this weekend, and I'll keep you guys updated. I might even tape it in case of fireworks 😉

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 13 of 23, by rick6

Rank: Member
gerwin wrote:

I modded an Asus 6200 by making an extra notch; it did not work again after doing that.

Could you have sawed it too far in?

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 14 of 23, by rick6

Rank: Member
j^aws wrote:

Well, I just checked both my 6800GT and 6800GE models, and they both have dual molex connectors, both are Asus cards and look identical to the pictured card above. Both cards are designed to be overclocked to Ultra speeds. The GE variant has 5 VS and 12 PS (shaders can be unlocked), whilst the GT variant has 6 VS and 16 PS (same as Ultra). It looks like these cards share the same PCB and cooling system.

I'm sorry, but just to make sure: did your cards also have the missing gold-plated contacts like the one I showed above, where the notch should be?

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 15 of 23, by j^aws

Rank: Oldbie
rick6 wrote:
j^aws wrote:

Well, I just checked both my 6800GT and 6800GE models, and they both have dual molex connectors, both are Asus cards and look identical to the pictured card above. Both cards are designed to be overclocked to Ultra speeds. The GE variant has 5 VS and 12 PS (shaders can be unlocked), whilst the GT variant has 6 VS and 16 PS (same as Ultra). It looks like these cards share the same PCB and cooling system.

I'm sorry, but just to make sure: did your cards also have the missing gold-plated contacts like the one I showed above, where the notch should be?

Yep, exactly like the picture. Both my cards have the same blue PCB and cooler.

Reply 16 of 23, by gerwin

Rank: l33t
rick6 wrote:

Could you have sawed it too far in?

There was a trace nearby, but I don't think it got damaged. I remember the trace's conductivity measured out OK, but still...

(I then obtained a universal AGP MSI NX6200AX card that worked fine. But since it was similar in performance to the previous MX440 128-bit, I lost interest quickly.)

--> ISA Soundcard Overview // Doom MBF 2.04 // SetMul

Reply 19 of 23, by j^aws

Rank: Oldbie
swaaye wrote:

If the chip supports 2x, I don't understand why there weren't more universal AGP cards....

Yeah, they could've made all these cards universal by default. But by that time, CPUs on AGP 2x-only mobos were too limiting (not considering Tualatin upgrades), and PCIe was taking off; not many bothered...