VOGONS


nVidia power consumption chart?


Reply 20 of 25, by zyga64

User metadata
Rank Oldbie

On this Polish site, you'll find a GPU comparator which also has information about GPU TDP. It starts from the GeForce 4 era and ends with the GTX 6xx (last update 2012-10-02).

1) VLSI SCAMP /286@20 /4M /CL-GD5422 /CMI8330
2) i420EX /486DX33 /16M /TGUI9440 /GUS+ALS100+MT32PI
3) i430FX /K6-2@400 /64M /Rage Pro PCI /ES1370+YMF718
4) i440BX /P!!!750 /256M /MX440 /SBLive!
5) iB75 /3470s /4G /HD7750 /HDA

Reply 21 of 25, by Putas

User metadata
Rank Oldbie
obobskivich wrote:

We could probably assume that chart is using TPU's furmark data, but there's no attribution or source citation, so it's impossible to say. That's another criticism for that chart - nothing is cited, no measurement procedures are explained, etc. So who knows where they stole their data from, and if whoever originally tested this stuff was doing a good job, or if the numbers are even directly comparable.

It is not they, it is just me, gathering information from the internet. I try to go for peak consumption with real-world software; the problem with gaming apps is that they may not capture the worst case. Furmark or a virus can do more, so I tend to factor that in at least a bit. Power measurements of graphics cards are by their nature only approximations. I could not be bothered to cite sources; it did not start as anything serious, and now it is too late. I don't want credit, and it never occurred to me this could be considered theft. It is really hard to make it consistent across generations: sometimes I get absolute values that do not match the old data, and when I try to even them out somewhat, things can snowball from there.

obobskivich wrote:

Like I said, I wouldn't put too much faith in it - it's lazily slapped together at best, and downright inaccurate at worst. If they can't even take the time to qualify where they're getting data from or what exactly they're showing with something as well measured and documented as the 290X, how can we trust their results for something older, rarer, etc? For example they claim FX 5800 Ultra is 74W TDP, just like 6800 Ultra, but from my own testing the 6800 Ultra draws more power in the same systems under the same working conditions. I've also never seen a published review showing power measurements for 5800 Ultra, so where are they getting their data from?

I agree, don't put faith in it; it is only my effort. You are right, consumption of the 6800 is higher. Here is an example of a 5800 Ultra test: https://web.archive.org/web/20030625110846/ht … re/1109/13.html
I did not find any duplicate SKUs, certainly not a 6800 GT.

Reply 22 of 25, by matieo

User metadata
Rank Newbie

Tom's Hardware did a handy one too that I have been using to decide on what to use in my XP system.

http://www.tomshardware.co.uk/geforce-radeon- … ew-31495-6.html

Gigabyte 440BX, C3 Ezra 866, Voodoo 3, 512mb, 32gb SSD, YMF718+Dreamblaster S2
Toshiba Satellite 320CDT 233mhz, 64mb, 32gb SSD

Reply 23 of 25, by obobskivich

User metadata
Rank l33t
Putas wrote:

It is not they, it is just me, gathering information from the internet.

Oh I see. 😐 😊

I try to go for peak consumption with real-world software; the problem with gaming apps is that they may not capture the worst case. Furmark or a virus can do more, so I tend to factor that in at least a bit.

Furmark is not "real world software" though - it's unrealistically demanding compared to most games. Even the numbers that TPU/Anand/etc generate using Crysis or Metro or similarly demanding games aren't entirely great imho - lots of more popular games, especially relatively older games, will load the hardware significantly less, and result in much lower power draws. For example DOTA2, which is usually one of the most played games on Steam at any given time, will draw a lot less power than Crysis or Metro, to say nothing of Furmark, because it isn't nearly as heavy a load. Of course worst-case shouldn't be ignored, but it's exactly that: worst-case.

And you're very right about it being an approximation, especially with cards that adaptively clock in response to load - 290X for example doesn't have a single clock-rate, it can adjust itself between 324MHz and 1000MHz in response to its temperature, current workload, etc. And power draw moves up and down as a result.

I could not be bothered to cite sources; it did not start as anything serious, and now it is too late. I don't want credit, and it never occurred to me this could be considered theft.

It's plagiarism; my primary issue is that it's impossible to know where the data came from to see how it was tested, conducted, etc. Because like you said - it gets tricky when you want to compare across generations. If it was all done as one big test on one set of relatively common hardware it would make the numbers directly comparable, whereas if it's the result of multiple tests with different methodologies it makes them harder to compare. Having no idea where any of that came from makes it even harder.

I agree, don't put faith in it; it is only my effort. You are right, consumption of the 6800 is higher. Here is an example of a 5800 Ultra test: https://web.archive.org/web/20030625110846/ht … re/1109/13.html
I did not find any duplicate SKUs, certainly not a 6800 GT.

That looks like the same pre-release presskit "75W" figure that a lot of other sites reported before 5800 Ultra was available. 😊 I tried browsing around but a lot of the other pages weren't archived or seemed to be missing "stuff" due to archiving. 😒 I'm not aware of any measured review that actually tested power draw on the 5800 Ultra - then again, there's only like 2 or 3 measured reviews for that card, given how briefly it was available. 😊

From my own observations it will increase system power draw over 5900XT by a little bit. TPU shows 5900XT as 35W TDP, and in my own testing (which is done at the AC outlet - knock 20-25% off due to PSU loss) system power draw with 5800U increases by up to around 30W vs 5900XT. This would put it at something like 50-60W DC. I don't think I'm putting either card at 100% loading (e.g. Furmark), though. Power draw is also pretty similar to the same machine with a Radeon 9800 in it, which TPU lists at 47W (for Pro) and 60W (for XT - I list both because I don't know which mine is; some utilities report it as R360 9800XT and others as R360 9800Pro). 6800 Ultra has been measured by TPU at like 74-75W under max load, and does put power consumption up another 20-30W over the 5800U, which would further support the 50-60W conclusion. But again - this is all inferential.
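
To spell that arithmetic out, here is a rough back-of-envelope sketch in Python; the 75-80% PSU efficiency range is just my assumption from the "knock 20-25% off" rule of thumb, not something I've measured for this particular supply:

# Back-of-envelope estimate: start from a card with a known DC figure,
# measure the extra AC draw at the wall with the unknown card installed,
# and scale that delta by an assumed PSU efficiency.
KNOWN_CARD_DC_W = 35.0         # 5900XT TDP per TPU
WALL_DELTA_W = 30.0            # extra AC draw seen with the 5800U vs the 5900XT
PSU_EFFICIENCY = (0.75, 0.80)  # assumed AC-to-DC efficiency ("knock 20-25% off")

def estimate_dc_draw(known_dc_w, wall_delta_w, efficiency):
    # Convert the AC-side delta to DC and add it to the known card's figure.
    return known_dc_w + wall_delta_w * efficiency

low = estimate_dc_draw(KNOWN_CARD_DC_W, WALL_DELTA_W, PSU_EFFICIENCY[0])
high = estimate_dc_draw(KNOWN_CARD_DC_W, WALL_DELTA_W, PSU_EFFICIENCY[1])
print(f"Estimated 5800 Ultra draw: {low:.0f}-{high:.0f} W DC")
# Prints roughly 58-59 W, in the same ballpark as the 50-60 W above.

Since neither card is at 100% load in my tests, treat that as a ballpark rather than a worst-case number.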

On the 6800GT - you're absolutely right, and I'll admit my error. I likely either mis-read 6600 as 6800, or XT as GT. 😊

matieo wrote:

Tom's Hardware did a handy one too that I have been using to decide on what to use in my XP system.

http://www.tomshardware.co.uk/geforce-radeon- … ew-31495-6.html

Nice find. Anand has a similar, but shorter, article too:
http://www.anandtech.com/show/2624

Of course the cards and CPUs there are probably far too new for a "retro build." 🤣

Reply 24 of 25, by Putas

User metadata
Rank Oldbie
obobskivich wrote:

Furmark is not "real world software" though - it's unrealistically demanding compared to most games. Even the numbers that TPU/Anand/etc generate using Crysis or Metro or similarly demanding games aren't entirely great imho - lots of more popular games, especially relatively older games, will load the hardware significantly less, and result in much lower power draws. For example DOTA2, which is usually one of the most played games on Steam at any given time, will draw a lot less power than Crysis or Metro, to say nothing of Furmark, because it isn't nearly as heavy a load. Of course worst-case shouldn't be ignored, but it's exactly that: worst-case.

Well, you seem to be more interested in average consumption. I will make sure to mark mine as peak values.

obobskivich wrote:

If it was all done as one big test on one set of relatively common hardware it would make the numbers directly comparable, whereas if it's the result of multiple tests with different methodologies it makes them harder to compare. Having no idea where any of that came from makes it even harder.

True, but I don't see it as a fatal objection; for me it is a mere complication.

obobskivich wrote:

That looks like the same pre-release presskit "75W" figure that a lot of other sites reported before 5800 Ultra was available.

I am not sure; they seem to have tests of specific models: http://www.tecchannel.de/pc_mobile/komponente … is/index53.html

obobskivich wrote:

From my own observations it will increase system power draw over 5900XT by a little bit. TPU shows 5900XT as 35W TDP, and in my own testing (which is done at the AC outlet - knock 20-25% off due to PSU loss) system power draw with 5800U increases by up to around 30W vs 5900XT. This would put it at something like 50-60W DC. I don't think I'm putting either card at 100% loading (e.g. Furmark), though. Power draw is also pretty similar to the same machine with a Radeon 9800 in it, which TPU lists at 47W (for Pro) and 60W (for XT - I list both because I don't know which mine is; some utilities report it as R360 9800XT and others as R360 9800Pro). 6800 Ultra has been measured by TPU at like 74-75W under max load, and does put power consumption up another 20-30W over the 5800U, which would further support the 50-60W conclusion. But again - this is all inferential.

Your numbers are sound. What are you using for testing?

Reply 25 of 25, by obobskivich

User metadata
Rank l33t
Putas wrote:

Well, you seem to be more interested in average consumption. I will make sure to mark mine as peak values.

Yes and no. I think peak values are important in terms of picking a PSU, especially if you're going to stress the system, but with more modern cards average consumption is very important because of their power management features - in some cases they actually end up being more power efficient than older, slower cards (with lower max TDPs). For example, based on its self-reporting, my 290X draws less power running Fallout 3 than the Radeon HD 4800 I originally played that game with. Even though on paper the 290X has higher TDP, utilization is a lot lower for an older game like Fallout 3. Of course there's a floor where older cards will probably still be more efficient for the same games, but it's just something to think about depending on what games one wants to play, and it seems that folks are starting to become more conscious of their gaming computer's power consumption.

True, but I don't see it as a fatal objection; for me it is a mere complication.

Agreed. I'm not saying don't compile the data, just that more information is always better. 😀

Your numbers are sound. What are you using for testing?

I tried the cards in two different boards with different CPUs and whatnot, and observed the same rough increases in power draw at the wall. Same PSU and power meter for comparison - it's just one of those plug-in AC meter things. So basically I'm able to see total draw including the losses due to the PSU, and making guesses at TDP based on measuring known values (like the 5900XT and 9800) against an unknown (the 5800U). Not the most accurate thing in the world, but it at least seems to generate a ballpark idea.