Flash really started to gain weight when Adobe took over, and it has kept getting slower every few months since. Just a year ago a fast Pentium III could handle 480p with "only" ~80% CPU usage. Today, that same machine can only manage about 16fps at 480p, and 360p, while smooth, still monopolizes the CPU almost completely.
Heck, I've seen brand-new AMD E1 dual-core laptops (which I believe run at around 1.5GHz) struggle with Flash video in Chrome. For some bizarre reason, Chrome refuses to use hardware-accelerated Flash decode, no matter what GPU is installed.
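If you want to experiment, Flash reads a plain-text mms.cfg file at startup, and one of its documented settings is supposed to push the player past its GPU driver blacklist. A rough sketch, with the caveat that the paths are my best guess (C:\Windows\SysWOW64\Macromed\Flash\mms.cfg for the 32-bit plugin on 64-bit Windows; Chrome's bundled Pepper Flash, I believe, keeps its own copy under Pepper Data\Shockwave Flash\System in the profile):

OverrideGPUValidation=1

That line tells Flash to skip its GPU/driver validation and attempt hardware acceleration anyway. Whether Chrome's Pepper plugin actually honors it is another question, so treat it as a long shot rather than a fix.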
Among other things, HTML5 was supposed to save us from Flash's poor performance, but its video playback appears to be just as CPU-intensive on systems without modern video cards.