You can avoid any trouble by pairing power-hungry AXPs with specific boards that put their CPU MOSFETs on the 12V rail, whenever that's an option. (Abit NF7, Epox 8RDA3+, Gigabyte 7NNXP, etc.)
AGP slot-powered cards tend to lean on 3.3V quite a bit. In theory they could pull up to 10W from 5V, but it's unlikely any got close to that. Performance cards with Molex power look to peak at about 20W-25W on 5V.
This X-bit Labs table on the GeForce FX 5900 Ultra reference board shows it's already nice and 12V heavy, like later cards:

Radeon 9700/9800 and some of the other GFFX board designs are the most troublesome, breaking 25W from 5V.
[Attachment 9800pro.gif is no longer available]
[Attachment fx5950u_table-b.gif is no longer available]
(That 5950 Ultra will give AGP slots a nice workout. It comes close and, after overclocking, actually breaks the 6A specification for the slot's 3.3V current supply.)
Athlon XPs can do up to 80W-90W of damage including VRM efficiency losses, so that's nearly 18A from 5V for the CPU alone; add a video card and you're easily past 20A. The rest of the system (PCI cards and drives) needs a couple more amps, and overclocking adds another couple.
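The back-of-the-envelope math behind that amperage claim can be sketched like this (the 90W CPU and 25W card figures are the ones mentioned above; the 2A "rest of system" allowance is my rough assumption):

```python
# Rough 5V rail budget for an Athlon XP system.
# Figures from the post: up to ~90W CPU draw (VRM losses included)
# and ~25W of 5V draw for a Molex-powered performance card.

def amps(watts, volts):
    """Current drawn from a rail at a given power."""
    return watts / volts

cpu_5v = amps(90, 5.0)    # ~18A for the CPU alone
card_5v = amps(25, 5.0)   # ~5A for the video card
misc_5v = 2.0             # PCI cards, drives: "a couple more amps" (assumed)

total = cpu_5v + card_5v + misc_5v
print(f"CPU: {cpu_5v:.0f}A, card: {card_5v:.0f}A, total: {total:.0f}A")
# → CPU: 18A, card: 5A, total: 25A
```

And that's before any overclocking headroom, which is why the 5V rating on these older supplies matters so much.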
It's also good to keep an eye on the 3.3V rail in cases where it might directly bleed off an already marginal 5V supply, like this guy:
[Attachment fsp400-60gen.jpg is no longer available]
Anything less than 150W of combined 3.3V/5V power is a definite issue for the worst-case combos.
Regardless, some (many?) motherboards are designed to generate 3.3V from the 5V rail, which sure doesn't help when that's the case.
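To see why that hurts: every watt the board converts down to 3.3V shows up as extra current on the 5V rail, divided by the converter's efficiency. A small sketch (the 85% efficiency figure is my guess, not from the post):

```python
# If the motherboard's 3.3V is derived from 5V, the 3.3V load is
# reflected back onto the 5V rail. Efficiency of 85% is an assumed
# ballpark for a small onboard buck converter.

def reflected_5v_amps(p_3v3_watts, efficiency=0.85):
    """Extra 5V rail current caused by a 3.3V load converted from 5V."""
    return p_3v3_watts / efficiency / 5.0

# An AGP card pulling ~20W from 3.3V (near the 6A slot spec):
extra = reflected_5v_amps(20)
print(f"~{extra:.1f}A added to the 5V rail")  # → ~4.7A
```

So a 3.3V-hungry card on one of these boards quietly piles another few amps onto a 5V rail that may already be near its limit.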
I've been educating myself a little on this subject today. 😀