I'd venture to say that the people who need or want it are us.
FTFY.
I suppose there may also be some industrial control systems that need 16-bit compatibility, but they really shouldn't be running a modern, 64-bit OS anyway (not having pre-emptive multitasking is a plus for that sort of thing).
On topic: Basically the only force pushing for faster hardware in the consumer space is the Internet, since the typical modern site is a bloated, JS-ridden POS. If more pages were static/small/easy to render, then I'd likely just stick with my dual Katmai build and call it a day.
The sad fact of the matter is that the most commonly used non-internet software hasn't actually advanced in a decade or more. If you gave me Office 95 and Office 2013, I couldn't find a functional difference between the two except that the new version dumped menus and LIKES TO USE CAPS A LOT. There's also barely any functional difference between Windows 2000 and Windows 8. They are both compatible with the same general API, both can support multiprocessor systems, both offer pre-emptive multitasking and memory protection, both provide support for hardware acceleration in games, both provide a TCP/IP stack, and both can run on the same hardware (except for the artificial restriction that Windows 8 places on instruction sets, but let's not get into that).
The average person hit "good enough" in 1989 with the 486. Virtually every change since then has been "Previous version but bigger and maybe with a new UI" on the software side or "Wider, deeper, and smaller" on the hardware side.
However, when we consider the high-performance, server, scientific computing, and arguably gaming side of things, we will never approach "enough." As soon as GPGPUs advance a generation, some researcher finds a way to put the additional FLOPS to good use. In fact, neural networks basically died from the 1980s until fairly recently because we didn't have the computing power to see them come to fruition properly. Now we do, and they are awesome! Servers can always use more cores by the nature of most server software. Gamers can always use higher resolutions and frame rates. Look at the move to 4K, a resolution that can still cripple today's most expensive graphics cards. With VR and other such technologies, we can readily take advantage of even more computing muscle to emulate a virtual world. There's also the fact that game AIs can always get better, but I'll just leave it at that...
Dual Katmai Pentium III (450 and 600MHz), 512ish MB RAM, 40 GB HDD, ATI Rage 128 | K6-2 400MHz / Pentium MMX 166, 80MB RAM, ~2GB Quantum Bigfoot, Awful integrated S3 graphics.