zyzzle wrote on 2021-07-17, 03:51:
dr_st wrote on 2021-07-15, 20:23:
zyzzle wrote on 2021-07-15, 19:44:
Of course there's hibernation and sleep modes, but I'm a stickler for not liking to leave anything "on" as even these low-power modes consume power all the time, and those costs do add up, especially at my power rates of 55 cents per kilowatt hour.
You can justify your habits in any way you want, but these alleged power costs are negligible. Try to find out how much an average PC consumes in sleep mode (hibernation is like power-off), and do the math.
Sure, I know it's 'negligible', but there are 8760 hours in a year, and my power costs me 56 cents a kilowatt-hour. So even one watt running continuously costs me about $5 in extra electricity per year. Multiply that by the 100 "wasted", negligible watts of an 'average' household that just leaves all its appliances in standby all the time, and you realize even sipping power is costing them nearly $500 per year. Little things multiply into big costs very quickly unless watched. That's how I look at it. Good stewardship is hard to accomplish when none of your electronic gadgets ever really turn off. They're that way by design (BAD design), so I have to use power strips to make sure my TV, microwave, etc. are really OFF at night, or when I'm away from the house.
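To put numbers on the claim, here's a quick sketch in Python (the 56 cents/kWh rate and the 100 W household standby load are the poster's own figures, not measured values):

```python
RATE_PER_KWH = 0.56   # dollars per kilowatt-hour (poster's quoted rate)
HOURS_PER_YEAR = 8760

def annual_cost(watts, rate=RATE_PER_KWH):
    """Yearly cost in dollars of a load drawing `watts` continuously."""
    return watts / 1000 * HOURS_PER_YEAR * rate

print(round(annual_cost(1), 2))    # one watt, around five dollars a year
print(round(annual_cost(100), 2))  # a 100 W standby load, roughly $490 a year
```

So one continuous watt really does come out to about $4.91/year at that rate, and 100 W of standby to roughly $490.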
Most people think that way by now, I guess. However, they all miss a flaw in the calculation.
Electrical devices age quickly if they are constantly switched on and off.
The temperature fluctuations cause material fatigue:
the material expands, shrinks, expands again. ICs and other delicate components don't like that.
Imagine a wire that's constantly bent back and forth; eventually it breaks.
(Voltage spikes, which occur during switching on and off, are also a problem. See below. They kill diodes, transistors, etc.)
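The thermal-fatigue argument can be made a bit more concrete. A standard engineering model for this is the Coffin-Manson relation, where cycles-to-failure scale as N_f ∝ (ΔT)^(-k); note this model and the exponent k ≈ 2 are my assumptions for illustration, the exponent is very component-specific:

```python
def relative_cycles_to_failure(delta_t_kelvin, k=2.0):
    """Relative number of power cycles survived, per Coffin-Manson: N_f ~ dT^(-k).

    delta_t_kelvin: temperature swing per on/off cycle.
    k: fatigue exponent (assumed ~2 here; real values vary by material).
    """
    return delta_t_kelvin ** (-k)

# Under this model, doubling the temperature swing per power cycle
# roughly quarters the number of cycles a joint survives:
mild = relative_cycles_to_failure(20)   # small warm-up swing
harsh = relative_cycles_to_failure(40)  # full cold-start swing
print(harsh / mild)  # 0.25 with k = 2
```

The absolute numbers don't matter here; the point is that fatigue life falls off steeply with the size of the temperature swing, which is what repeated full power-offs produce.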
Anyway, I've told people about this many times. They simply ignore the matter.
Heck, even schools continue to teach that nonsense about power savings.
Another bad side effect of power saving is the load peaks it puts on the power grid.
If millions or billions of devices switch on and off simultaneously, they place a big burden on the grid.
There are exceptions, of course. A CPU will die at some point if it overheats, for example.
So, say, a HALT instruction is useful to let it pause.
However, reducing the clock speed to a permanent, safe level would be best.
In this respect, old computers from the 80s/early 90s did it better.
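The HALT idea has a user-space analogy that's easy to demonstrate: a busy loop burns power polling a flag, while a blocking wait lets the OS put the thread (and ultimately a CPU core, via HLT or similar) to sleep until there's work. A minimal sketch using Python's standard threading module:

```python
import threading
import time

done = threading.Event()

def worker():
    time.sleep(0.05)  # simulate work finishing a little later
    done.set()        # signal completion

threading.Thread(target=worker).start()

# Blocking wait: the main thread sleeps instead of spinning in a hot loop,
# much like a CPU parked on HLT until the next interrupt.
done.wait()
print("woke up only when there was work:", done.is_set())
```

The same power argument applies at every level: sleeping on an event instead of polling is what lets the hardware underneath actually idle.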
"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel
//My video channel//