Merovign wrote on 2023-06-28, 03:37:
Holy noodle, why does the XPS 720 have a 1000W PSU? I'm sure it's overrated, but holy noodle. I can't imagine it as sold drawing more than 500-600 watts peak with dual video cards. And, yeah, it needs a C19 power cord it doesn't have; it would be funny if that's why it was discarded.
In the noughts there were a couple of things going on with PSU numbers. First, at the start of the decade PSUs had about a third of their watts on 12V and two-thirds on 3.3/5V, and motherboards were mostly pulling from the 5V rail... then that changed a couple of years in, but the PSU market lagged behind, so to get a couple of hundred watts on 12V the advice was to buy a 600W unit. Scale that up to hundred-watt CPUs and GPUs in the same system at ~80% efficiency and the advised and desired rating quickly ballooned to 1000W... even though 600W of that was going spare on the 3.3/5V rails, and designs were swinging further in 12V's favor all the while. "Experts" would still be advising 1000W, though.
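To put rough numbers on that, here's a back-of-the-envelope sketch; the one-third split and the wattage targets are illustrative figures from the reasoning above, not from any datasheet:

```python
# Back-of-the-envelope: how big a label rating you had to buy to get a
# given number of watts on the 12V rail, back when only about a third
# of a PSU's rated output lived on 12V (illustrative split, not a spec).

def rating_needed(watts_on_12v, twelve_volt_share=1/3):
    """Total label wattage so the 12V rail alone covers the target."""
    return watts_on_12v / twelve_volt_share

print(f"{rating_needed(200):.0f}W label for 200W on 12V")   # ~600W
print(f"{rating_needed(330):.0f}W label for 330W on 12V")   # ~990W
```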
Secondly... the second- and third-tier PSU manufacturers were quoting their power figures the way lower-tier stereo manufacturers quote PMPO: wildly optimistic. To get a middle-of-the-road Socket A system with an MX440 to boot off a cheap PSU, the label needed to say 500 or 600W to guarantee it did as well as, say, an Enermax or Fortron 300W (quality and longevity being another matter). By the middle of the noughts, some of the "better" second-tier outfits were doing the same thing, with the justification that if they had 30A on the 12V rail (360W), the unit ought at least to be as good as an older 750W, even if there was now only 150W max on the 5V. That, of course, reinforced "needing" 750W where the unit was really a 500W one.
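Adding the rails up the honest way makes the point; a toy sketch using the figures from the paragraph above, with the 3.3V amperage an assumption for illustration, not a real unit's spec:

```python
# Toy rail-sum check for the "750W" example above: add up what the
# rails can actually deliver. Real units also have combined-rail
# limits, so even this simple sum is on the optimistic side.

rails_watts = {
    "12V":  12.0 * 30,   # 30A on 12V -> 360W
    "5V":   150.0,       # quoted 5V maximum
    "3.3V": 3.3 * 14,    # ~14A on 3.3V -> ~46W (assumed for the sketch)
}

total = sum(rails_watts.values())
print(f"Rails add up to ~{total:.0f}W")   # ~556W: a ~500W unit, not 750W
```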
Eventually enough people wised up to start adding the watts of the outputs together and asking why they didn't match the label, and there was enough power-demanding hardware around that the liars would get a flood of RMAs if they pushed it too far. There was a tendency to overbuy the PSU "for the future", so things would be way out of warranty by the time the user found out; but when a few hundred watts was needed from the get-go by a powerful CPU and GPU, they got found out quick. So into the 2010s, PSU watt ratings actually began to make some sort of sense again...
But of course we have the opposite problem for older hardware now: powerful early-noughts systems need a couple of hundred watts on 3.3/5V, and those rails get only a third down to a fifth of a modern PSU's output, so once again you need a stupidly large watt rating to get the power onto the rails you need it on.
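Same arithmetic, mirrored; again a rough sketch, where the one-third-to-one-fifth share is the figure from the paragraph above rather than any particular unit's spec:

```python
# The mirror problem today: only a fraction of a modern PSU's rating is
# on 3.3/5V, so a retro build needing ~200W there needs a huge label.

def rating_needed(minor_rail_watts, minor_share):
    """Label wattage so the 3.3/5V rails combined cover the target."""
    return minor_rail_watts / minor_share

for share in (1/3, 1/4, 1/5):
    print(f"1/{round(1/share)} share -> "
          f"~{rating_needed(200, share):.0f}W label for 200W on 3.3/5V")
# 1/3 -> ~600W, 1/4 -> ~800W, 1/5 -> ~1000W
```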
Anyhoo, Dell didn't want systems sitting on the shelf with "experts" muttering about how weak their PSUs were, so, maybe for marketing reasons, they overspecced the PSU.
Edit: Oh, I forgot... the third factor... overclockers. They were becoming more of an influence on the performance and gaming mainstream: in the late 90s they were a quiet little subgroup just doing their thing, then in the early noughts, boom, fame... kinda. A new market at least, with products appearing.

Anyway, as a PSU goes past 70 or 80 percent of its rating on a given rail, the electrical noise goes up a bit as the capacitors run out of headroom and the semiconductors start to lose efficiency. On a quality supply that would still be inside limits at up to 100% load, and a totally in-spec device receiving that power should be fine. On a totally out-of-spec device, though, it can make the difference between crashing and running smooth. So overclockers would want a PSU that's only at 70% load at what they perceive as their maximum power demand, which is itself probably some 50% over stock. That means overclockers want darn near double the real watts anybody else wants for the same hardware, because they're going to run it at the limit. And because they had more influence on gaming and performance computing at this point, gamers and performance-computing people came to want what the overclockers wanted.
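The "darn near double" falls straight out of the arithmetic; a sketch, where the 50% overclock headroom and the 70% load ceiling are the figures from the paragraph above:

```python
# Why overclockers "need" ~2x the wattage everyone else does for the
# same hardware: pad stock draw by the overclock headroom, then size
# the PSU so that worst case sits at only ~70% of its rating.

def oc_psu_rating(stock_watts, oc_headroom=0.5, load_ceiling=0.7):
    """PSU rating keeping an overclocked worst case at 70% load."""
    worst_case = stock_watts * (1 + oc_headroom)
    return worst_case / load_ceiling

rating = oc_psu_rating(300)
print(f"300W stock build -> ~{rating:.0f}W PSU ({rating/300:.2f}x stock)")
# ~643W, i.e. about 2.14x the stock draw
```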
Edit II: y tho? So, in conclusion to my thesis... 🤣 ... just thought I'd drop a note about why I think overclocking suddenly popped into mainstream consciousness. IMO it was that a mathematician found the FDIV bug, and overclockers found the instability in the 1.13 GHz PIII... HardOCP and Tom's Hardware pooled resources to document it, and I think more quickly than the FDIV bug got pinned down... anyway, online news about that put a lot of eyeballs on their sites.
Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.