I really don’t see the point in this. First, the amount of undervolting you can achieve without sacrificing performance has a negligible effect on lifespan. Second, these used cards have most of their operating hours already behind them, from the era when they were purchased and originally used. The hours they accumulate in retro use are often insignificant, especially for those who rotate between several different systems. Third, graphics chip degradation is only one failure mode, and IMO one of the least likely: capacitors fail, memory chips die, BGA solder balls crack under the graphics chip or the memory, etc.
While lowering the graphics chip temperature does in theory reduce the severity of each heat cycle on the BGA joints, in practice it’s pretty much meaningless. Like I said, these cards have most of their power cycles already behind them, and shaving a couple of degrees off the temps isn’t going to change anything either. If those joints are going to crack, the process has already started.
Like others have said, the best preventive thing you can do is to ensure the cards have good airflow and, if necessary, improve the cooling. Some cards, such as the Radeon 9700 and 9800, have atrocious heatsinks and fans, so if you happen to have a working card, it would be a good idea to replace the original cooler with something more effective.