Putas wrote:
It is not they, it is just me, gathering information from the internet.
Oh I see. 😐 😊
I try to go for peak consumption with real-world software; the problem with gaming apps is that they may not capture the worst case. Furmark or a power virus can do more, so I tend to factor that in at least a bit.
Furmark is not "real world software" though - it's unrealistically demanding compared to most games. Even the numbers that TPU/Anand/etc generate using Crysis or Metro or similarly demanding games aren't entirely great imho - lots of more popular games, especially relatively older games, will load the hardware significantly less, and result in much lower power draws. For example DOTA2, which is usually one of the most played games on Steam at any given time, will draw a lot less power than Crysis or Metro, to say nothing of Furmark, because it isn't nearly as heavy a load. Of course worst-case shouldn't be ignored, but it's exactly that: worst-case.
And you're very right about it being an approximation, especially with cards that adaptively clock in response to load - the 290X, for example, doesn't have a single clock rate; it can adjust itself between 324MHz and 1000MHz in response to its temperature, current workload, etc., and power draw moves up and down as a result.
I could not be bothered to cite sources; it did not start as anything serious and now it is too late. I don't want credit, and it never occurred to me this could be considered theft.
It's plagiarism; my primary issue is that it's impossible to know where the data came from to see how it was tested, conducted, etc. Because like you said - it gets tricky when you want to compare across generations. If it was all done as one big test on one set of relatively common hardware, the numbers would be directly comparable, whereas if it's the result of multiple tests with different methodologies, they're harder to compare. Having no idea where any of it came from makes that even harder.
I agree, don't put faith in it, it is only my own effort. You are right, consumption of the 6800 is higher. Here is an example of a 5800 Ultra test: https://web.archive.org/web/20030625110846/ht … re/1109/13.html
I did not find any duplicate SKUs, and certainly not a 6800 GT.
That looks like the same pre-release presskit "75W" figure that a lot of other sites reported before the 5800 Ultra was available. 😊 I tried browsing around, but a lot of the other pages weren't archived or seemed to be missing "stuff" due to archiving. 😒 I'm not aware of any review that actually measured power draw on the 5800 Ultra - then again, there are only like 2 or 3 measured reviews for that card, given how briefly it was available. 😊
From my own observations it will increase system power draw over 5900XT by a little bit. TPU shows 5900XT as 35W TDP, and in my own testing (which is done at the AC outlet - knock 20-25% off due to PSU loss) system power draw with 5800U increases by up to around 30W vs 5900XT. This would put it something like 50-60W DC. I don't think I'm putting either card at 100% loading (e.g. Furmark), though. Power draw is also pretty similar to the same machine with Radeon 9800 in it, which TPU lists at 47W (for Pro) and 60W (for XT - I list both because I don't know which mine is; some utilities report it as R360 9800XT and others as R360 9800Pro). 6800 Ultra has been measured by TPU at like 74-75W under max load, and does put power consumption up another 20-30W over the 5800U, which would further support the 50-60W conclusion. But again - this is all inferential.
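For what it's worth, here's a quick back-of-envelope sketch of that estimate in Python (the function and variable names are just mine; the 20-25% PSU loss and the 35W / 30W figures are the assumptions quoted above, not measured facts):

# Rough estimate of the 5800 Ultra's DC draw from the wall-outlet delta
# against the 5900XT's known 35W TDP. Purely illustrative numbers from the post above.

def dc_from_ac_delta(ac_delta_w, psu_loss):
    # Assume a fraction psu_loss (20-25%) of the wall power is lost in the PSU,
    # so only the remainder shows up as DC draw.
    return ac_delta_w * (1.0 - psu_loss)

baseline_tdp_w = 35   # 5900XT TDP per TPU
wall_delta_w = 30     # observed system power increase at the AC outlet with the 5800U

for loss in (0.20, 0.25):
    estimate = baseline_tdp_w + dc_from_ac_delta(wall_delta_w, loss)
    print(f"PSU loss {loss:.0%}: 5800 Ultra roughly {estimate:.0f} W DC")
# -> about 57-59 W, i.e. the "50-60W DC" ballpark mentioned above.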
On the 6800GT - you're absolutely right, and I'll admit my error. I likely either mis-read 6600 as 6800, or XT as GT. 😊
matieo wrote:
Tom's Hardware did a handy one too that I have been using to decide what to use in my XP system.
http://www.tomshardware.co.uk/geforce-radeon- … ew-31495-6.html
Nice find. Anand has a similar, but shorter, article too:
http://www.anandtech.com/show/2624
Of course the cards and CPUs there are probably far too new for a "retro build." 🤣