Games have always been designed more or less to take advantage of the CPU and GPU power available at the time, maybe looking ahead a couple of years, and the GPU side of that only started around the time 3D accelerators became a thing at all. 3D benchmark apps were usually (especially in the case of 3DMark) designed not so much to showcase what a particular card could do as to show how an overall system would perform with present-day, or near-future, games. So, pretty much the same thing.
Every system is different, and every game too. Some games may be CPU-constrained on your system, some GPU-constrained. That's as true now as it was 20 years ago. Developers can't predict how absolutely everybody is going to configure their systems at any given time, but the real wild card is the CPU and GPU manufacturers themselves... who often release hardware advertised as "current" that's really last-gen hardware at a low price.

The FX 5200 was one such GPU; it was worse than the GeForce 4 MX series (already a budget series within the GeForce 4 line, performance-wise more in line with the GeForce 3) even at the time of its release. The card was really intended for OEMs to put in systems so they could advertise having an "FX" card, but it was in no way comparable to the "real" FX cards like the 5800 and 5900. There's not really even a modern-day equivalent; Nvidia still makes plenty of lower-end graphics cards today, but they don't put them in the current "RTX 40" series. They put them in a lower series (30 series, 16 series, etc.), which was an option I guess they hadn't thought of in the FX era.
So it's not surprising that a better FX card will be CPU-constrained in some games and/or benchmarks, even while an FX 5200 is clearly GPU-constrained in those same games/benchmarks (assuming the same CPU). The software was designed around what was assumed to be an average-performance CPU/GPU combo at the time, but the balance can obviously be off in one direction or the other.
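If it helps, here's a rough back-of-the-envelope way to think about it. Frame time is limited by whichever side finishes last, so with the same CPU, a slow card leaves you GPU-bound and a fast card leaves you CPU-bound. The numbers below are made up purely for illustration (real engines overlap CPU and GPU work in messier ways); this is just a toy Python sketch of the idea, not a benchmark of any actual hardware:

```python
# Toy model: a frame can't finish faster than its slowest stage, so
# frame time is roughly max(CPU work, GPU work) per frame.
# All values are hypothetical, not real measurements.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Rough frames-per-second estimate when CPU and GPU work overlap."""
    frame_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    return 1000.0 / frame_ms

CPU_MS = 12.0        # hypothetical per-frame CPU cost for some game on some CPU

slow_gpu_ms = 30.0   # budget card: the GPU is the bottleneck (GPU-bound)
fast_gpu_ms = 6.0    # high-end card: now the CPU is the bottleneck (CPU-bound)

print(f"slow card: {fps(CPU_MS, slow_gpu_ms):.0f} fps (GPU-bound)")
print(f"fast card: {fps(CPU_MS, fast_gpu_ms):.0f} fps (CPU-bound, capped by the CPU)")
```

Same CPU in both cases, but only the faster card ever runs into the CPU ceiling, which is exactly the pattern you see when a 5900 stops scaling in a benchmark while a 5200 is still nowhere near maxing out the CPU.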
If what you're saying is that you shouldn't assume all FX cards are equal, I do agree with that. There was a period when you really could not assume even current-gen performance from current-gen Nvidia products unless you knew your model numbers. Some would argue that's also true of the 4060 today, but as a 4060 owner *and* an owner of older Nvidia cards from when things really were very different, I disagree 😀