Reply 40 of 49, by SirNickity
48V requires a DC-DC converter that compromises on range vs. efficiency. Nothing is free. Regulating 48V down to 1.0V or less for high-speed digital components (RAM, CPU, etc.) is an inefficient process compared to 5V->1V or even 12V->1V, because the wider the step-down ratio, the harder it is for the converter to stay efficient.
The reason it might make sense for server applications is the high current demand. You have to move 12V at high amperage around somehow, and that requires thicker wire and thick pours of copper. The losses from compromising on conductor area can outweigh the conversion losses. That's not the case for single-CPU systems that sit idle most of the time.
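To put rough numbers on that: conduction loss goes with the square of the current, so quadrupling the bus voltage cuts the copper loss by 16x for the same delivered power. This is a sketch with made-up but plausible figures (600 W delivered, 2 mΩ of round-trip path resistance); the exact values aren't from any particular system.

```python
# Illustrative I^2*R comparison: same power, different bus voltage.
# P_LOAD and R_PATH are assumed numbers, not from a real design.
P_LOAD = 600.0   # watts delivered to the load
R_PATH = 0.002   # ohms, assumed round-trip resistance of the copper path

for v in (12.0, 48.0):
    i = P_LOAD / v           # current at this bus voltage
    loss = i * i * R_PATH    # power burned in the copper itself
    print(f"{v:>4.0f} V bus: {i:5.1f} A, {loss:.4f} W lost in copper")
```

At 12V you're pushing 50A and losing 5W in this hypothetical path; at 48V it's 12.5A and about 0.31W. Whether that beats the extra conversion loss at the point of load is exactly the trade-off being argued.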
It makes sense for PoE applications due to line length. Higher voltage, lower current -- again, less loss over lo-o-o-o-o-ong runs of cable. OK for 15W, even for 30W. Now we're starting to talk about 90W per port (!), which is just ridiculous. Before long, people will be stringing their old incandescent lighting systems off of an Ethernet switch. I digress...
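The same square-law arithmetic shows why 90W per port gets silly. A rough sketch, assuming ~50 V on the wire and ballpark DC loop resistances for 100 m of Cat5e (about 12.5 Ω using two pairs, roughly half that when all four pairs carry power, as the 90W class does); these are approximations, not figures from the 802.3 spec:

```python
# Rough cable-loss estimate per PoE power class over 100 m of Cat5e.
# Voltage and loop resistances are assumed ballpark values.
V = 50.0  # nominal PoE voltage on the cable

# (label, watts on the wire, assumed DC loop resistance in ohms)
cases = [
    ("15 W (2 pairs)", 15.0, 12.5),
    ("30 W (2 pairs)", 30.0, 12.5),
    ("90 W (4 pairs)", 90.0, 6.25),
]

for name, p, r_loop in cases:
    i = p / V                # line current at the nominal voltage
    loss = i * i * r_loop    # watts dissipated in the cable itself
    print(f"{name}: {i:.2f} A, {loss:.2f} W heating the cable")
```

By this estimate the 15W case loses about 1W in the cable, but the 90W case loses on the order of 20W over a full-length run -- a big chunk of the budget literally spent warming up the walls.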
12V to a motherboard is fine. Asking peripheral manufacturers to start transitioning to single-rail power for drives would be fine. (Heck, drives are mostly an obsolete phenomenon anyway, except for increasingly niche applications like NAS, servers, etc.) Doing bulk 12V-to-5V conversion on a motherboard is just a dumb proposition*, except for tightly integrated systems where the load is minimal and known in advance. For one, it's just moving the problem from a device that is capable of doing the conversion in the most opportune way to a device that is already crammed full of stuff and has the least flexibility in form factor to handle high-power conversion. I really hope that's not where this standard is headed, because it would just be asinine.
(* I want to be clear: By this, I mean it would be a dumb proposition to transition the standard in this way. I do not mean to imply commentary here was dumb for mentioning it as a potential interpretation of the news.)
I'm not really sure whether to expect Intel to know better at this point, though. I really think they truly view the NUC as the next unit of computing, and "everything should look like this now." IMO, the industry giants (Intel, MS, and Apple) are infected with a tragic case of tunnel vision. It's maybe forgivable when your market is a smaller demographic of computing, but Intel and Microsoft especially have a greater obligation on account of their market share, since their actions reflect on a much larger cross-section of users.