It's no surprise that software back then didn't take advantage of graphics acceleration, the FPU, MMX, and other such technologies until 3D GPUs became popular and OpenGL/DirectX started to mature. Explicitly supporting so much complexity and variety of hardware simply wasn't practical. I remember back in the '90s I didn't understand how consoles, with comparatively slow CPUs and tiny amounts of memory, were able to outperform PCs - I did not know that most PC software simply did not utilize much of the hardware's capabilities.
I remember working and saving the entire summer to purchase my first Pentium PC. Still a teenager, I finally bought one at a computer fair: a Pentium 166 with a 1MB Cirrus Logic graphics adaptor. The PC easily shredded through older 2D games, and handled 2.5D games such as Doom and Duke3D pretty well, but struggled with some early "true 3D" games. In particular I remember getting pretty low FPS in the original Quake, NFS2, and Fatal Racing. I asked around, and someone suggested I upgrade my video card to a Matrox Millennium, which was around $400 back then - a huge amount for a high school student. Nevertheless, I saved up and purchased a 4MB Millennium. With all the fancy claims on the box, I was expecting it to be twice as fast as my old Cirrus Logic, but in games there was no noticeable difference. The Windows 95 interface was a lot snappier on the Millennium, but at the time I was not particularly concerned with that. I wanted smooth frame rates in 3D games, and once I got my first 3dfx Voodoo card, my jaw dropped the first time I saw Glide-enabled games.
I was also recently watching a podcast - a bit too technical for me - but it turns out Windows always had 2D GUI acceleration supported to some degree by various graphics adaptors. For the last 10 years or so, however, that hardware acceleration has been dropped, and GPU drivers now fully render the Windows GUI in software.
https://youtu.be/rdjmtFExuC4