VOGONS


nVidia power consumption chart?


Reply 20 of 26, by zyga64

Rank: Oldbie

On this Polish site, you'll find a GPU comparator that also includes GPU TDP information. It starts at the GeForce 4 era and ends with the GTX 6xx series (last update 2012-10-02).

Scamp: 286@20 /4M /CL-GD5422 /CMI8330
Aries: 486DX33 /16M /TGUI9440 /GUS+ALS100+MT32PI
Triton: K6-3+@400 /64M /Rage Pro PCI /ES1370+YMF718
Seattle: P!!!750 /256M /MX440 /Vibra16s+SBLive!
Panther Point: 3470s /8G /GTX750Ti /HDA

Reply 21 of 26, by Putas

Rank: Oldbie
obobskivich wrote:

We could probably assume that chart is using TPU's furmark data, but there's no attribution or source citation, so it's impossible to say. That's another criticism for that chart - nothing is cited, no measurement procedures are explained, etc. So who knows where they stole their data from, and if whoever originally tested this stuff was doing a good job, or if the numbers are even directly comparable.

It is not "they", it is just me gathering information from the internet. I try to go for peak consumption with real-world software; the problem with gaming apps is that they may not capture the worst case. Furmark or a virus can do more, so I tend to factor that in at least a bit. Power measurements of graphics cards are by their nature only approximations. I could not be bothered to cite sources; it did not start as anything serious, and now it is too late. I don't want credit, and it never occurred to me this could be considered theft. It is really hard to make it consistent across generations; sometimes I get absolute values that do not match old data, and when I try to even things out, it can snowball from there.

obobskivich wrote:

Like I said, I wouldn't put too much faith in it - it's lazily slapped together at best, and downright inaccurate at worst. If they can't even take the time to qualify where they're getting data from or what exactly they're showing with something as well measured and documented as the 290X, how can we trust their results for something older, rarer, etc? For example they claim FX 5800 Ultra is 74W TDP, just like 6800 Ultra, but from my own testing the 6800 Ultra draws more power in the same systems under the same working conditions. I've also never seen a published review showing power measurements for 5800 Ultra, so where are they getting their data from?

I agree, don't put faith in it; it is only my own effort. You are right, the consumption of the 6800 is higher. Here is an example of a 5800 Ultra test: https://web.archive.org/web/20030625110846/ht … re/1109/13.html
I did not find duplicate SKUs, certainly not a 6800 GT.

Reply 23 of 26, by obobskivich

Rank: l33t
Putas wrote:

It is not "they", it is just me gathering information from the internet.

Oh I see. 😐 😊

I try to go for peak consumption with real-world software; the problem with gaming apps is that they may not capture the worst case. Furmark or a virus can do more, so I tend to factor that in at least a bit.

Furmark is not "real world software" though - it's unrealistically demanding compared to most games. Even the numbers that TPU/Anand/etc generate using Crysis or Metro or similarly demanding games aren't entirely great imho - lots of more popular games, especially relatively older games, will load the hardware significantly less, and result in much lower power draws. For example DOTA2, which is usually one of the most played games on Steam at any given time, will draw a lot less power than Crysis or Metro, to say nothing of Furmark, because it isn't nearly as heavy a load. Of course worst-case shouldn't be ignored, but it's exactly that: worst-case.

And you're very right about it being an approximation, especially with cards that adaptively clock in response to load - 290X for example doesn't have a single clock-rate, it can adjust itself between 324MHz and 1000MHz in response to its temperature, current workload, etc. And power draw moves up and down as a result.

I could not be bothered to cite sources; it did not start as anything serious, and now it is too late. I don't want credit, and it never occurred to me this could be considered theft.

It's plagiarism; my primary issue is that it's impossible to know where the data came from to see how it was tested, conducted, etc. Because like you said - it gets tricky when you want to compare across generations. If it was all done as one big test on one set of relatively common hardware it would make the numbers directly comparable, whereas if it's the result of multiple tests with different methodologies it makes them harder to compare. Having no idea where any of that came from makes it even harder.

I agree, don't put faith in it; it is only my own effort. You are right, the consumption of the 6800 is higher. Here is an example of a 5800 Ultra test: https://web.archive.org/web/20030625110846/ht … re/1109/13.html
I did not find duplicate SKUs, certainly not a 6800 GT.

That looks like the same pre-release presskit "75W" figure that a lot of other sites reported before 5800 Ultra was available. 😊 I tried browsing around but a lot of the other pages weren't archived or seemed to be missing "stuff" due to archiving. 😒 I'm not aware of any measured review that actually tested power draw on the 5800 Ultra - then again, there's only like 2 or 3 measured reviews for that card, given how briefly it was available. 😊

From my own observations it will increase system power draw over 5900XT by a little bit. TPU shows 5900XT as 35W TDP, and in my own testing (which is done at the AC outlet - knock 20-25% off due to PSU loss) system power draw with 5800U increases by up to around 30W vs 5900XT. This would put it something like 50-60W DC. I don't think I'm putting either card at 100% loading (e.g. Furmark), though. Power draw is also pretty similar to the same machine with Radeon 9800 in it, which TPU lists at 47W (for Pro) and 60W (for XT - I list both because I don't know which mine is; some utilities report it as R360 9800XT and others as R360 9800Pro). 6800 Ultra has been measured by TPU at like 74-75W under max load, and does put power consumption up another 20-30W over the 5800U, which would further support the 50-60W conclusion. But again - this is all inferential.
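
To make the arithmetic explicit, here is a minimal sketch (in Python) of that back-of-envelope estimate. The 75-80% PSU efficiency range ("knock 20-25% off") and the 35W/30W figures come from the paragraph above; the function and variable names are just for illustration.

def card_dc_estimate(ac_delta_watts, psu_efficiency):
    # Convert an AC increase measured at the wall into estimated DC watts.
    return ac_delta_watts * psu_efficiency

baseline_tdp = 35.0  # 5900XT, per TPU
ac_delta = 30.0      # observed wall-draw increase with the 5800U

for eff in (0.75, 0.80):
    est = baseline_tdp + card_dc_estimate(ac_delta, eff)
    print(f"PSU efficiency {eff:.0%}: ~{est:.0f}W DC")
# Prints ~58W and ~59W - consistent with the 50-60W ballpark above.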

On the 6800GT - you're absolutely right, and I'll admit my error. I likely either mis-read 6600 as 6800, or XT as GT. 😊

matieo wrote:

Tom's Hardware did a handy one too, which I have been using to decide what to use in my XP system.

http://www.tomshardware.co.uk/geforce-radeon- … ew-31495-6.html

Nice find. Anand has a similar, but shorter, article too:
http://www.anandtech.com/show/2624

Of course the cards and CPUs there are probably far too new for a "retro build." 🤣

Reply 24 of 26, by Putas

Rank: Oldbie
obobskivich wrote:

Furmark is not "real world software" though - it's unrealistically demanding compared to most games. Even the numbers that TPU/Anand/etc generate using Crysis or Metro or similarly demanding games aren't entirely great imho - lots of more popular games, especially relatively older games, will load the hardware significantly less, and result in much lower power draws. For example DOTA2, which is usually one of the most played games on Steam at any given time, will draw a lot less power than Crysis or Metro, to say nothing of Furmark, because it isn't nearly as heavy a load. Of course worst-case shouldn't be ignored, but it's exactly that: worst-case.

Well, you seem to be more interested in average consumption. I will make sure to mark mine as peak values.

obobskivich wrote:

If it was all done as one big test on one set of relatively common hardware it would make the numbers directly comparable, whereas if it's the result of multiple tests with different methodologies it makes them harder to compare. Having no idea where any of that came from makes it even harder.

True, but I don't see it as a fatal objection; for me it is a mere complication.

obobskivich wrote:

That looks like the same pre-release presskit "75W" figure that a lot of other sites reported before 5800 Ultra was available.

I am not sure; they seem to have tests of specific models: http://www.tecchannel.de/pc_mobile/komponente … is/index53.html

obobskivich wrote:

From my own observations it will increase system power draw over 5900XT by a little bit. TPU shows 5900XT as 35W TDP, and in my own testing (which is done at the AC outlet - knock 20-25% off due to PSU loss) system power draw with 5800U increases by up to around 30W vs 5900XT. This would put it something like 50-60W DC. I don't think I'm putting either card at 100% loading (e.g. Furmark), though. Power draw is also pretty similar to the same machine with Radeon 9800 in it, which TPU lists at 47W (for Pro) and 60W (for XT - I list both because I don't know which mine is; some utilities report it as R360 9800XT and others as R360 9800Pro). 6800 Ultra has been measured by TPU at like 74-75W under max load, and does put power consumption up another 20-30W over the 5800U, which would further support the 50-60W conclusion. But again - this is all inferential.

Your numbers are sound; what are you using for testing?

Reply 25 of 26, by obobskivich

Rank: l33t
Putas wrote:

Well, you seem to be more interested in average consumption. I will make sure to mark mine as peak values.

Yes and no. I think peak values are important in terms of picking a PSU, especially if you're going to stress the system, but with more modern cards average consumption is very important because of their power management features - in some cases they actually end up being more power efficient than older, slower cards (with lower max TDPs). For example, based on its self-reporting, my 290X draws less power running Fallout 3 than the Radeon HD 4800 I originally played that game with. Even though on paper the 290X has higher TDP, utilization is a lot lower for an older game like Fallout 3. Of course there's a floor where older cards will probably still be more efficient for the same games, but it's just something to think about depending on what games one wants to play, and it seems that folks are starting to become more conscious of their gaming computer's power consumption.

True, but I don't see it as a fatal objection; for me it is a mere complication.

Agreed. I'm not saying don't compile the data, just that more information is always better. 😀

Your numbers are sound; what are you using for testing?

I tried the cards in two different boards with different CPUs and whatnot, and observed the same rough increases in power draw at the wall. Same PSU and power meter for comparison - it's just one of those plug-in AC meter things. So basically I'm able to see total draw including the losses due to the PSU, and making guesses at TDP based on measuring known values (like the 5900XT and 9800) against an unknown (the 5800U). Not the most accurate thing in the world, but it at least seems to generate a ballpark idea.
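
One way to formalize that "known cards as reference" inference, as a rough sketch in Python: take wall readings with two cards of known DC draw, solve for the system's baseline draw and effective PSU efficiency, then plug in the reading for the unknown card. Only the 35W/47W reference figures come from this thread; the wall readings and function names below are invented for illustration.

def calibrate(known):
    # wall_watts = (baseline_dc + card_dc) / efficiency, so readings for
    # two cards of known draw give two equations in two unknowns.
    (w1, c1), (w2, c2) = known
    efficiency = (c1 - c2) / (w1 - w2)
    baseline_dc = w1 * efficiency - c1
    return baseline_dc, efficiency

def estimate_card_dc(wall_watts, baseline_dc, efficiency):
    return wall_watts * efficiency - baseline_dc

# Hypothetical wall readings (W) paired with known DC figures (W):
known_cards = [(170.0, 35.0),   # e.g. 5900XT at TPU's 35W
               (185.0, 47.0)]   # e.g. 9800 Pro at TPU's 47W
base, eff = calibrate(known_cards)
unknown = estimate_card_dc(200.0, base, eff)  # invented 5800U wall reading
print(f"baseline ~{base:.0f}W DC, efficiency ~{eff:.0%}, card ~{unknown:.0f}W DC")
# With these made-up readings: baseline ~101W, efficiency ~80%, card ~59W.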

Reply 26 of 26, by Angel984

Rank: Newbie
mirh wrote on 2015-01-30, 21:42:

Wikipedia only starts listing TDP with the 8xxx series...

But it wouldn't be bad if, while you are at it, you could note down your findings 😈

Yes, it has started, but it is not complete:
https://en.wikipedia.org/wiki/List_of_Nvidia_ … s:_Desktop_GPUs

And some of it would be very interesting.
For example:
TNT2 (M64, base, Pro, Ultra): no info, ?W
GeForce1, TSMC 220nm, 120MHz, TDP 12-13W. Is that just the GPU BGA's power consumption, or the full card? Somewhere it is given as 18W.

GeForce2 GTS, TSMC 180nm, 166MHz, TDP 6W. Really? Just 6W, like a TNT2 or a GeForce2 MX400?
But there is no data for the others:
GeForce2 Pro, TSMC 180nm, 200MHz, TDP ?
GeForce2 Ultra, TSMC 180nm, 250MHz, TDP ?
And GeForce2 Ti, TSMC 150nm, 250MHz, TDP ? Should it be lower than the GTS, or the same as the Pro?

Similarly missing:
GeForce3 Ti200, TSMC 150nm, 175MHz, TDP ? Maybe under 20W?
GeForce3, TSMC 150nm, 200MHz, TDP ?
GeForce3 Ti500, TSMC 150nm, 240MHz, TDP 29W

GeForce FX 5200, TSMC 150nm, NV34, 200MHz: 21W
but
GeForce FX 5200, TSMC 150nm, NV34, 200MHz, with slow memory: ?W, ~18?
GeForce FX 5500, TSMC 150nm, NV34B, 270MHz: ?W
GeForce FX 5600 XT, TSMC 130nm, NV31, very low 235MHz (the normal 5600 is 325MHz): ~the same Mtexels as the 5200, and that one is 21W on 150nm

GeForce FX 5700 VE/LE, TSMC 130nm, NV36, very low 250MHz: 20W; but the normal 5700 at 425MHz is the same 20W on 130nm (more efficient than the 5200 on 150nm)
So I don't think the 5700 VE at 250MHz should be the same 20/21W as the 5700 at 425MHz

GeForce 6200 LE, NV44, 350MHz, TSMC 110nm, half core (2:1:2:1): ?W
GeForce 6200A (AGP), NV44A, 350MHz, 64-bit, TSMC 110nm, 4:3:4:2: ?W
But similarly:
GeForce 6200 TurboCache, NV44, 350MHz, 64-bit, TSMC 110nm: 25W

GeForce 6200 (AGP), NV43, 4:3:4:4 (like a 128-bit 6600), TSMC 110nm: 20W
GeForce 6600 LE, NV43, TSMC 110nm, half core (4:3:4:4): at 6200-like speed, should it be the same 20W?
GeForce 6600A (AGP), NV43, TSMC 110nm: 26W

And:
GeForce 7100 GS, TSMC 90nm, 350MHz: 13W
GeForce 7300 GS, G72, TSMC 90nm, 550MHz: just 10W, which is similarly strange

The reason why I am asking this:
I would like to put a better card into an AOpen MX3L 440LX board (AGP 1.0, 2x speed), and from what I read it has a limit on the AGP 3.3V rail of 6A, which means 19.8W.
How much can the 5V rail supply?
Why are later 4X/8X boards not limited by this 6A problem?
A 440LX board can work with cards that have an external power connector.
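
Since the question boils down to a power-budget check, here is a minimal Python sketch of it. The 19.8W figure is just 3.3V x 6A as stated above; the card wattages are the rough numbers quoted in this thread, and treating a card's whole TDP as 3.3V-rail load is a deliberate worst-case assumption, since AGP cards can also draw from the slot's 5V and 12V pins.

SLOT_3V3_LIMIT_W = 3.3 * 6.0  # the stated 6A limit on 3.3V -> 19.8W

# Rough per-card figures quoted elsewhere in this thread:
cards = {
    "GF4 MX440": 18.0,
    "FX 5200 128-bit": 21.0,
    "FX 5700": 20.0,
    "Radeon 9600": 17.0,
}

for name, watts in cards.items():
    verdict = "within" if watts <= SLOT_3V3_LIMIT_W else "over"
    print(f"{name}: {watts:.0f}W -> {verdict} the 19.8W 3.3V budget")

On those worst-case numbers, the FX 5200 and 5700 sit at or above the 3.3V budget, which would fit the "very top limit" behaviour of the FX 5200 reported below.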

I checked some cards with Win 98SE (3DMark99, 3DMark2000, Q3 1.32):
TNT2 M64 125/150: OK
TNT2 Ultra 150/183: OK (some cheap Pro-card PCB with high clocks, "maybe Ultra" 😁)
GF2 MX200 64-bit: OK. 1W? (I don't think so... the card's power circuitry gets very hot; maybe 4-6W minimum)
GF4 MX440 NV17 128-bit: OK, 18W
FX5200 NV34 128-bit: OK, 21W, at the very top of the limit (some graphics issues, maybe just a driver problem)

Radeon 9200SE, 200MHz core, RV280, 64-bit, 150nm: OK (TDP unknown)
Radeon 9250, 240MHz core, RV280, 128-bit, 150nm: OK (TDP unknown)
Connect3D Radeon 9550SE, 240MHz core, RV350, 64-bit, with 3.3V keying (rare type): does not start, just beep codes (TDP unknown); but the higher-clocked 9600 is just 17W

Will check in the near future:
5700 128-bit: 20W??
9200, 128-bit, 250MHz core
Connect3D Radeon 9600, 128-bit, with 3.3V keying (rare type) (maybe it will not work)
And it would be good to try a 6200 AGP (64-bit or 128-bit), a 7300 GS AGP, a 7300 GT with external power, a GF4 MX460, or a GF2 Ti.

Aopen MX3L+ 533MHz Mendocino+ Gf4 MX440 128MB 128Bit / FX5200 128MB 128Bit / Radeon 9250 128Bit
Asus Tusl2 (Non-C full Agp pro) + 1.4GHz PIII-S SL6BY + 7600GS
(other Sparkle FX5700, Aopen GF4 Ti4200)
Samsung 796MB (Y2006)