First post, by Hoping
For some time now I have been experimenting with undervolting, both on GPUs and CPUs, and the results have really surprised me.
The oldest case: a DX4 at 100 MHz, stable at 2.7 V, which barely got warm to the touch.
Then an Athlon 64 3200+ (Socket 754, 1.5 V default if I remember correctly), stable at 1.3 V and also running very cool.
A Core 2 Duo E6700 (1.3 V default), stable at 1.2 V and likewise very cool.
With the voltage drop, the dissipated power decreases, and I suppose the life expectancy of the CPU or GPU will also increase.
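If I understand the usual CMOS dynamic-power model correctly, dynamic power scales roughly with the square of the voltage at a fixed clock, so even a modest undervolt pays off. A quick sketch using the Athlon 64 numbers above (the model is an assumption on my part; only the ratio matters, not absolute watts):

```python
# Rough CMOS dynamic-power estimate: P_dyn is proportional to V^2 * f,
# with capacitance and clock held fixed here.
def dynamic_power_ratio(v_new, v_old):
    """Fraction of original dynamic power after an undervolt at the same clock."""
    return (v_new / v_old) ** 2

ratio = dynamic_power_ratio(1.3, 1.5)  # Athlon 64 3200+: 1.5 V -> 1.3 V
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power at the same frequency")
```

That works out to roughly a quarter less dynamic power, which matches how much cooler the chip runs.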
Now for more modern hardware: a laptop I have with an AMD A10-7300M, already a low-voltage CPU, surprised me with a significant drop as well, and with what that means for temperature: 57.4 degrees Celsius running Prime95 and FurMark at the same time, while keeping the same frequency as before the undervolt.
An even more recent example: an XFX RX 580 with a default of 1366 MHz at 1150 mV. I hit the sweet spot at 1080 mV and 1400 MHz, with a maximum temperature of 58 degrees Celsius under a Gelid Icy Vision-A heatsink. I wasn't really thinking about overclocking, but an extra 34 MHz is not that much anyway...
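The same back-of-the-envelope model (assuming dynamic power goes with V squared times frequency, which I take on faith) suggests the card draws less power even with the small overclock, since voltage enters squared while frequency only enters linearly:

```python
# Dynamic-power ratio for the RX 580 undervolt above, assuming
# P_dyn proportional to V^2 * f with capacitance held constant.
def dyn_power_ratio(v_new_mv, f_new_mhz, v_old_mv, f_old_mhz):
    """P_new / P_old: voltage counts twice, frequency once."""
    return (v_new_mv / v_old_mv) ** 2 * (f_new_mhz / f_old_mhz)

r = dyn_power_ratio(1080, 1400, 1150, 1366)
print(f"~{(1 - r) * 100:.0f}% less dynamic power despite the small overclock")
```

So roughly a tenth less dynamic power even while running 34 MHz faster, which would explain the 58 degree maximum.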
The older the hardware, the harder it is to reduce the voltage, since most motherboards had no such controls; there were only predefined voltages set with jumpers or switches, if I'm not mistaken. Fine-grained CPU voltage control on motherboards started around the Socket 462/370 era. For GPUs I don't know; maybe around the NVIDIA GeForce 8000 series or soon after.
Something I'd find interesting, just out of curiosity, would be undervolting a Pentium 4 Prescott, but I don't have any because I have always avoided them, and Pentium 4s in general. But hey, I have too many Socket 462 boards already, so maybe I should get hold of a Pentium 4 or two.
So the big question is: how does undervolting affect the MOSFETs, is it harmful or beneficial? If the power consumed drops when the voltage drops, does the current stay the same, rise to compensate (since P = V*I), or fall as well? This is the part that worries me the most, because my knowledge is not enough, and I cannot interpret the information I find on the internet in this regard.
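My own tentative reading (and I may be wrong, which is why I'm asking): a CPU is not a constant-power load, so the current does not rise to compensate. If the CMOS model holds, current at a fixed clock is roughly proportional to voltage, so both current and power fall together, and the MOSFETs should if anything have an easier time. A sketch with the Core 2 Duo numbers above:

```python
# Assumed CMOS behaviour at a fixed clock: I proportional to V,
# so P = V * I scales with V^2. Current falls, it does not rise.
def current_ratio(v_new, v_old):
    """I_new / I_old at fixed capacitance and clock."""
    return v_new / v_old

def power_vs_stock(v_new, v_old):
    """P = V * I, hence proportional to V^2 at the same frequency."""
    return current_ratio(v_new, v_old) * (v_new / v_old)

# Core 2 Duo E6700 example: 1.3 V -> 1.2 V
print(f"current: {current_ratio(1.2, 1.3):.0%} of stock")
print(f"power:   {power_vs_stock(1.2, 1.3):.0%} of stock")
```

That would mean about 92% of the stock current and 85% of the stock power, so less stress on the VRM, not more. But I'd like someone who actually knows to confirm it.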
Another curiosity: to what extent does undervolting extend the life of the hardware, or shorten it? Is it worth it? It is as laborious as overclocking, but with the advantage that it is not especially dangerous.