Reply 160 of 179, by Anders-
Jo22 wrote on 2021-07-17, 06:04:
zyzzle wrote on 2021-07-17, 03:51:
dr_st wrote on 2021-07-15, 20:23:
You can justify your habits in any way you want, but these alleged power costs are negligible. Try to find out how much an average PC consumes in sleep mode (hibernation is like power-off), and do the math.
Sure, I know it's 'negligible', but there are 8760 hours in a year, and my power costs me 56 cents a kilowatt-hour. So even one watt running continuously costs me about $5.00 in extra electricity per year. Multiply that by the 100 "wasted", negligible watts of an 'average' household that leaves all its appliances in standby all the time, and you realize even sipping power is costing them $500+ per year. Little things multiply into big costs very quickly unless watched. That's how I look at it. Good stewardship is hard to accomplish when all of your electronic gadgets never really turn off. They're that way by design (BAD design), so I have to use power strips to make sure my TV, microwave, etc. are really OFF at night, or when I'm away from the house.
Most people think that way by now, I guess. However, they all miss a flaw that's in the calculation.
Electrical devices do age quickly if constantly switched on/off.
Figuratively speaking: Because of fluctuations in temperature, material fatigue happens.
The material expands, shrinks, expands... ICs and other delicate elements don't like that.
Imagine a wire that's constantly bent back and forth; it eventually breaks.
(Overvoltage, which happens during on/off switching, is also a problem. See below. It kills diodes, transistors, etc.)
Anyway, I told people many times about it. They simply ignore the matter.
Heck, even in schools they continue to teach that nonsense about power savings.
Another bad side effect of power-savings are the peaks on the power grids.
If millions/billions of devices switch on and off simultaneously, they put a big burden on the power grids.
There are exceptions, of course. A CPU dies at some point if it overheats, for example.
So, say, a HALT instruction is useful to let it pause.
However, reducing clock speed to a permanent, safe level would be best.
In this respect, old computers from the 80s/early 90s did it better.
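For what it's worth, zyzzle's numbers above do check out; here's a minimal sketch of the arithmetic, using the $0.56/kWh rate quoted in the post (your local rate will differ):

```python
# Sanity-check of the standby-cost arithmetic quoted above.
HOURS_PER_YEAR = 24 * 365   # 8760
PRICE_PER_KWH = 0.56        # USD per kWh, the rate quoted in the post

def yearly_cost(watts):
    """Cost in USD of drawing `watts` continuously for one year."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * PRICE_PER_KWH

print(yearly_cost(1))    # one standby watt: ~$4.91/year
print(yearly_cost(100))  # 100 standby watts: ~$490/year
```

So at that rate, one continuous watt really does come to roughly five dollars a year, and 100 watts to roughly $500.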
I understand the point about fatigue, but it might not be a big deal in the big picture...
An example would be old CRT monitors without power-saving features; they get turned on and off a lot. I haven't had a single one break yet - knock on wood.
Another point being that you really don't want old electronics on while leaving home, additional risk of fire and so on.
As for the power grids, fortunately not everyone turns things on/off at the same time.
The big industry cuts down on energy use in the evening so all the retro-users can go home and turn on their monitors 😀
Reducing clock speed to a permanent "safe" level is bad for performance; you want it running as fast as possible under the current circumstances.
Of course it introduces wear and tear, but on what scale? A cpu being worn out after just 20 years instead of 30?