The way a game is programmed factors heavily into this as well. With a lot of modern games using other people's engines, some of which can be pretty old by this point, they may not be optimized or written to properly handle some of the optimization settings available in graphics drivers.
Pre-rendered frames is one of the biggest ones. When it was introduced a little over a decade ago, virtually nothing could handle it properly, not even benchmarking software! I remember testing out one of the 3DMark programs at the time (the one with the whole Matrix-inspired lobby sequence) and noticing that if pre-rendered frames were on, it would just render the same frame multiple times and visually kill the framerate, even though it was REPORTING a higher rate because it really was rendering multiple frames! :O
You do have to be careful though: some modern games rely heavily on pre-rendered frames to keep the framerate up because of how they do their processing and rendering. A poor framerate can sometimes be solved simply by adding an additional pre-rendered frame... but then you've got the whole input lag thing to worry about, so it becomes a matter of finding a balance between framerate and responsiveness. :P
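For anyone curious what this looks like from the programming side, here's a minimal sketch of how a Direct3D 11 game can cap its own render-ahead queue through DXGI. The helper function name is made up for illustration, and it assumes you already have a working ID3D11Device:

#include <d3d11.h>
#include <dxgi.h>

// Hypothetical helper: limit how many frames the driver may queue ahead.
// 1 = least input lag, 3 = typical driver default; higher values can smooth
// out an uneven framerate at the cost of added latency.
void SetRenderAheadLimit(ID3D11Device* device, UINT maxPreRenderedFrames)
{
    IDXGIDevice1* dxgiDevice = nullptr;
    if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                         reinterpret_cast<void**>(&dxgiDevice))))
    {
        dxgiDevice->SetMaximumFrameLatency(maxPreRenderedFrames);
        dxgiDevice->Release();
    }
}

Note that the driver control panel setting can still override whatever the game asks for here, which is exactly why fiddling with it can help (or hurt) games that never expose the option themselves.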
Threaded Optimization is another setting that can cause weird issues, but only with extremely few games. The worst this setting usually does is dramatically increase CPU usage in games which normally use very little. :P
Anisotropic Filtering usually makes things look really good, at least in 3D, but it can cause strange artifacts in some 2D games and can make thin lines on textures look blurry. That doesn't matter for most 2D games, but it's something to watch out for.
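If you're wondering how a game asks for anisotropic filtering in the first place, here's a minimal Direct3D 11 sketch. The function name is made up, and the address/LOD values are just reasonable defaults, not anything official:

#include <d3d11.h>

// Hypothetical helper: create a sampler that requests anisotropic filtering.
// maxAniso is 1..16; higher values keep textures sharp at oblique angles
// but cost more fillrate.
ID3D11SamplerState* CreateAnisoSampler(ID3D11Device* device, UINT maxAniso)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.MaxAnisotropy  = maxAniso;
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* sampler = nullptr;
    device->CreateSamplerState(&desc, &sampler);
    return sampler;
}

The 2D artifacts presumably come from the driver forcing this kind of angle-dependent sampling onto flat, screen-aligned sprites that were never drawn with it in mind.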
One thing that can dramatically change performance on all cards is the "Texture Filtering Quality" setting. This setting has been around for a long time, and essentially, the further you set it towards performance over quality, the better your framerate will be, but the blurrier textures will look up close. The funny thing is, this setting alone can create a MASSIVE difference in framerate. Nowadays the high quality setting usually still gives a decent enough FPS, but in the early 2000s this setting could mean the difference between 60 FPS and 10 FPS! :O
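To give a rough idea of what that slider is trading away, here's an illustrative sketch of the same quality ladder from the application side in Direct3D 11. The function and the exact mapping of "levels" to filter modes are my own invention, but the filter modes themselves are standard:

#include <d3d11.h>

// Hypothetical mapping from a quality/performance level to a texture filter.
// Each step down skips filtering work per pixel, which is where the big
// framerate differences used to come from.
D3D11_FILTER PickTextureFilter(int performanceLevel)
{
    switch (performanceLevel)
    {
    case 0:  return D3D11_FILTER_ANISOTROPIC;              // highest quality
    case 1:  return D3D11_FILTER_MIN_MAG_MIP_LINEAR;       // trilinear
    case 2:  return D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT; // bilinear, visible mip seams
    default: return D3D11_FILTER_MIN_MAG_MIP_POINT;        // fastest, blockiest
    }
}

The driver's slider works differently under the hood (it quietly substitutes cheaper approximations of whatever the game asked for), but the tradeoff is the same idea.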
And of course, there's the setting this thread was started over: Power Management. As I said earlier, most games handle this well enough. The challenge is when a game occasionally switches between graphical situations which require tons of power and situations which require very little, as this confuses the GPU's power management and may cause heavily decreased framerates when returning to complex rendering from simple rendering, until the driver finally clues in, realizes the power is needed, and ramps the clocks back up. :P
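There's no direct way for a game to tell the GPU to stay clocked up, but you can at least catch the symptom. Here's a hypothetical little diagnostic in C++ you could drop into a frame loop; everything about it (the struct name, the spike threshold, the smoothing factor) is made up for illustration:

#include <chrono>
#include <cstdio>

// Logs frames that take much longer than the recent average -- the telltale
// sign of the GPU being slow to ramp its clocks back up after a quiet stretch.
struct FrameSpikeLogger
{
    using clock = std::chrono::steady_clock;
    clock::time_point last = clock::now();
    double avgMs = 16.7; // running average, seeded at ~60 FPS

    void EndFrame()
    {
        clock::time_point now = clock::now();
        double ms = std::chrono::duration<double, std::milli>(now - last).count();
        last = now;
        if (ms > avgMs * 2.0)
            std::printf("Frame spike: %.1f ms (recent avg %.1f ms)\n", ms, avgMs);
        avgMs = avgMs * 0.95 + ms * 0.05; // exponential moving average
    }
};

If the spikes always line up with returning to a heavy scene right after a light one, that's a good hint it's the power management setting and not the game itself.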
I've also noticed, being on a Windows 8.1 system now, that the further back you go with your compatibility mode setting, the worse the framerates get. With any game using the Unity engine, I have to set compatibility to Windows 7, otherwise I get extremely strange visual stuttering or framerate drops, but this in turn slightly decreases the maximum performance I can expect. Setting it to Windows XP drops the maximum performance dramatically, though of the extremely few games I've run into with issues, Windows XP compatibility doesn't fix them anyway. :P
--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg