VOGONS


First post, by HangarAte2nds!

Rank: Newbie

Built my first SLI rig recently with the following specs:
Core 2 Extreme X6800
Gigabyte GA-P35-DS4
4GB DDR2-667
2x GeForce 7800 GTX 256MB
128GB SSD
EVGA 80+ White 500W
The cards are identical, Nvidia version. Recently cleaned and repasted.
I initially started it with just one card working and then got both running after installing the suitable driver in XP. It appears to run in SLI mode without a bridge. I didn't know that worked retroactively on older cards, but the driver was dated 2014.

My issue is that they both run rather hot even at idle. Well, I guess displaying the desktop at 1080p in 32-bit color isn't really "idle" for this old hardware, 🤣. Playing a game that wasn't very taxing for the setup (Brothers in Arms: Earned in Blood) didn't seem to raise temps much, but since I wasn't able to hold my fingers to the heat spreaders on the back of the cards for more than a few seconds, I'm going to guess 90-100°C. I also think the fans were spinning faster, though, so this is probably just what the cards are regulated to. Based on the specs, they would throttle at 115°C (WTF??). That seems incongruous with what I'm used to from modern GPUs, and probably too hot.

I installed a fan to supply fresh air to the blower intakes on the cards because I am running it in an open case. Eventually it will go in a HAF X case with a glass panel and an intake fan right over the cards, but for now this is my test setup.
I plan to install Afterburner tomorrow, tune, and retest, but if anyone can confirm this as "normal" operation, it would put my mind at ease. I would like to be able to run Crysis on this rig without it melting. BTW, in case anyone is wondering, the motherboard is cool as a cucumber, even with the X6800 running at a 20% overclock.

There was an additional Molex plug on the motherboard for extra PCIe power. Unplugging it had no apparent effect on power draw, and thus on temps and fan speeds. Does anyone know if these cards need the 6-pin power, strictly speaking? I wonder if they could run solely on slot power without the supplemental connection??? Maybe starving 'em a bit would be the solution.

Attachment: 20211112_202052.jpg (1.45 MiB, public domain)

Reply 1 of 8, by The Serpent Rider

Rank: l33t++

I wasn't able to hold my fingers to the heat spreaders on the back of the cards for more than a few seconds, I am going to guess 90-100C

*chuckles*

I would like to be able to run Crysis on this rig without it melting

Not possible with any combination of GeForce 7 cards. The game is simply too heavy for Nvidia's shader-anemic DirectX 9 cards.

My issue is that they both run rather hot even at idle.

The GeForce 7800 has very primitive power-saving features in 2D mode, so that's to be expected. Operating temperatures under 3D load can also be quite high, reaching 80°C.

Does anyone know if these cards need the 6 pin power

Yes, they consume more than 75 W, which is the PCIe slot limit. You can underclock them, but no software can undervolt them, so underclocking alone has limited usefulness. Undervolting is possible on the Radeon X1800/1900, though, which let you tweak all major card voltages from the OS.
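A rough back-of-envelope sketch of the power budget discussed above. The 75 W figures come from the PCIe spec (slot and one 6-pin connector); the ~110 W per-card load number is an assumption based on commonly reported figures for the 7800 GTX, not a measured value:

```python
# PCIe power-budget sketch for one 7800 GTX.
# CARD_LOAD_W is an assumed worst-case draw, not a measurement.
SLOT_LIMIT_W = 75      # max a PCIe x16 slot may supply
SIX_PIN_LIMIT_W = 75   # max from one 6-pin PCIe connector
CARD_LOAD_W = 110      # assumed worst-case draw of one 7800 GTX

def needs_aux_power(card_draw_w: int, slot_limit_w: int = SLOT_LIMIT_W) -> bool:
    """True if the card cannot run on slot power alone."""
    return card_draw_w > slot_limit_w

# Headroom left once the 6-pin connector is attached.
headroom_w = SLOT_LIMIT_W + SIX_PIN_LIMIT_W - CARD_LOAD_W

print(needs_aux_power(CARD_LOAD_W))  # True: slot power alone is not enough
print(headroom_w)                    # 40 W of margin with the 6-pin attached
```

So even on these assumed numbers, running without the 6-pin would push the slot well past spec, which matches the advice that the connector is required.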

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 2 of 8, by acl

Rank: Oldbie

Hi,

I have two 7800 GTX (256 MB) cards in SLI in one of my setups too. They tend to run quite hot, even with brand-new thermal paste / thermal pads (tested both), a complete cleaning, and lubricated fans.
Running F.E.A.R. at Ultra settings @1080p, the cards are around 90°C (seen in GPU-Z).
Interestingly, one card heats up more than the other, even though the cards are identical (WinFast 7800 GTX 256). Maybe SLI is not 100% symmetrical and one card takes more of the workload.

I'm trying to switch to 7800 GTX 512 MB cards, but they are quite difficult to find... got one, will wait for another...

Attachment: IMG_20210825_225146.jpg (1.08 MiB, CC-BY-4.0)

"Hello, my friend. Stay awhile and listen..."
My collection (not up to date)

Reply 4 of 8, by liqmat

Rank: l33t

One of my most disappointing upgrades, and my least favorite Nvidia series of cards, was the 7000 series. I had an EVGA GeForce 6800 Ultra Extreme Edition and moved to a 7800 GTX in anticipation of Oblivion. Oblivion ran just OK at medium quality and quite sluggish at max settings with the 7800 GTX (forget my other specs, but they weren't shabby). So I bought another for my first SLI setup. I can't tell you how surprised I was when I averaged about a 30% overall increase in performance. Maybe 40% if I got really lucky. For that kind of money I was kinda pissed. Oblivion ran smoother, but two 7800 GTX cards were still not enough to run Oblivion silky smooth at max settings, and my system was now a space heater. So yeah, your story does not surprise me. I jumped on the 8800 GTX as soon as I could and never looked back.

I can always remember the game that made me upgrade. The 6800 Ultra was for Doom 3 and HL2. 7800 GTX for Oblivion.

BUT

The 8800 GTX was, for the first time, bought out of desperation just to get away from the 7800 GTX, and not for a particular game. 🤣


Reply 5 of 8, by The Serpent Rider

Rank: l33t++

GeForce 7 series overall aged like milk. Horrible driver transition with Vista and very lackluster performance in games after 2006.


Reply 6 of 8, by acl

Rank: Oldbie
The Serpent Rider wrote on 2021-12-02, 13:36:

Easier to grab two 7900 GTX or 7950 GT cards with 512 MB.

Yes, but I try to build setups from a specific era.
This one is strictly <= 2005.
78xx cards are from 2005; 79xx are from 2006.

For the same reason, I use Socket 939 instead of AM2 (2006).
I hesitated a lot over which CPU to pick. Single or dual core?
In 2005, as a teenager with no money, I had a single-core Athlon 64. Dual core was only a dream.
Vengeance is a dish best served cold.
In 2021 I took my revenge and picked an Opteron 180 (2 cores, 2.4 GHz, OC @2.6), a late-2005 CPU (~800€ in 2005 -> 40-80€ in 2021).

liqmat wrote on 2021-12-02, 15:14:

I can always remember the game that made me upgrade. The 6800 Ultra was for Doom 3 and HL2. 7800GTX for Oblivion.

I played Oblivion for probably more than 1k hours, but my computer was not that great: Athlon 64 3000+, Radeon 9600 XT.
I bought it when I was ~16, in 2004, and used it until around 2010. Oblivion was never fluid for me 😁


Reply 7 of 8, by liqmat

Rank: l33t
The Serpent Rider wrote on 2021-12-02, 19:08:

GeForce 7 series overall aged like milk. Horrible driver transition with Vista and very lackluster performance in games after 2006.

You're right. I always call it the 7000 series. Correction: 7 series.

Reply 8 of 8, by swaaye

Rank: l33t++

Dual-card cooling was a problem for sure. I have a feeling dual-card setups benefit a lot from blower-style coolers that exhaust out the rear, as long as the cards have some space between them, which was another problem that came up.