dr_st wrote:If you're building general-purpose software for PCs, then none of this matters.
What is 'general-purpose'?
I can think of a number of very common applications where performance is very important.
One obvious example is games: the main selling point of many AAA titles is how much eye candy you get, and how well they run on the average person's system.
Games like Q3A, Half-Life 2 and DOOM 2016 are perfect examples of games that are highly optimized to get 60 fps on a wide range of systems, so people can get a very good online experience.
Developing game engines and shaders is all about optimization. Game developers compete directly with each other on eye candy and framerate.
In general, if you develop middleware (libraries and development kits), performance is a very important aspect.
Another example is movie editing software. Especially at high resolutions, such as 4K, you want highly optimized post-processing filters and encoding/decoding, because with unoptimized software it will take forever to render movies, even on the most high-end system you can buy.
There are probably plenty of other examples (VST plugins? The faster they are, the more effects and instruments you can use without having to mix them down in between).
OSes themselves are obviously also highly optimized, as are many drivers, especially for video cards. Performance is everything.
On the flip side, this also means that the average programmer who uses a modern OS and a modern programming framework will make use of a lot of highly optimized libraries and modules.
In the old days people had to implement many data structures and algorithms themselves, such as hash tables, binary search trees, etc. These days they are included, either in the standard libraries, or even as a fundamental part of the language itself.
I would say there is just a lot more well-optimized code around these days than there was in the past.
Of course that doesn't prevent some developers from being completely incompetent, and making terrible software.