VOGONS



GeForce power management mode


First post, by m1so

Rank: Member

If I set, for example, Steam to prefer maximum performance, will every application run at maximum GPU power as long as Steam is running, including non-Steam games? I don't want my PC to be loud and hot all the time, but I also don't want to set power profiles for every game.

Reply 1 of 30, by Gemini000

Rank: l33t

That will not work, but you can set a default of maximum performance for all 3D applications, as this will not affect the card when it is not doing 3D work. Also, if you both force vertical sync on and enable triple buffering, this will further reduce power draw when it isn't necessary, by idling the GPU between frames when it's done rendering but still waiting to display the next frame because it's running faster than the display can physically handle. This won't hurt performance, although any game which can't maintain a perfect 60 FPS may drop to 30 FPS instead of settling somewhere between those two extremes.
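A rough way to see why that 60-to-30 drop happens with plain double-buffered vsync: a frame that misses a refresh deadline has to wait for the next one, so the effective rate snaps to whole divisors of the refresh rate (60, 30, 20...), which is exactly the behaviour triple buffering is meant to soften. A minimal sketch of the arithmetic in Python (render times made up for illustration):

```python
import math

# Minimal sketch: effective frame rate under plain double-buffered vsync.
# A finished frame has to wait for the next 60 Hz refresh boundary, so a
# render time just over 16.7 ms lands at 30 FPS rather than ~55 FPS.

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms

def effective_fps_with_vsync(render_time_s: float) -> float:
    """The frame is shown at the first refresh boundary after it finishes."""
    refreshes_waited = math.ceil(render_time_s / REFRESH_INTERVAL)
    return REFRESH_HZ / refreshes_waited

for ms in (10.0, 16.0, 18.0, 25.0, 40.0):
    print(f"render {ms:4.0f} ms -> {effective_fps_with_vsync(ms / 1000):4.0f} FPS")
# render   10 ms ->   60 FPS
# render   16 ms ->   60 FPS
# render   18 ms ->   30 FPS  (just missed the deadline, so the rate halves)
# render   25 ms ->   30 FPS
# render   40 ms ->   20 FPS
```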

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 2 of 30, by m1so

Rank: Member

Nvidia does have adaptive vsync, which turns vsync off when the game can't maintain the refresh rate. What I am concerned about is desktop power usage: my GTX 660 stays at full clock when I turn on "Prefer maximum performance". How do I set it so that only 3D applications go to full clock?

Reply 3 of 30, by obobskivich

Rank: l33t
m1so wrote:

Nvidia does have adaptive vsync, which turns vsync off when the game can't maintain the refresh rate. What I am concerned about is desktop power usage: my GTX 660 stays at full clock when I turn on "Prefer maximum performance". How do I set it so that only 3D applications go to full clock?

There's no need to have it force "full clock" even in 3D applications (because if it needs to do that, it will do it automatically anyway). Just let GPU Boost do what it does; if you haven't updated your drivers in the last 6-8 months, I'd suggest doing that too. They broke advanced shadows in The Sims 2 after 320.something, but it doesn't affect the game (just turn the shadow setting down one notch from the top and it works like it should; honestly, I can't tell much, if any, difference). In the 33x.x drivers I have installed, the clocking is much more granular than it was last spring, which has resulted in lower temperatures in some applications (if you watch the clocks plotted in GPU-Z it looks more like a "wave" than "stair steps" - my Fermi still shows "stair steps", but it seems to adjust faster).

Adaptive vsync, in my experience, results in both intermittent screen tearing and choppy performance in some games (Skyrim being the worst, but I've seen it in Half-Life 2, Empire: Total War, and Fallout: New Vegas as well). It's the worst of both worlds. 😵

If the card isn't running at full or boost clocks in whatever game, it's because it has determined it isn't necessary - there's no point in having it overdraw and run at 900 FPS. Vsync will, as Gemini000 pointed out, knock that down further, because the card is only obligated to produce 60 (or 75, or whatever) FPS. In some games this actually improves stability/performance too (some games don't like running at 100s-1000s of FPS).

As far as setting it goes, you'd have to do it with per-application profiles; you can't just set Steam or Origin or whatever globally. But again, there's no point in doing this. Let the power management do its job; lots of games (especially if you aren't running at extremely high resolutions) don't need full clock to run extremely well (my GTX 660 runs Skyrim on Ultra at DC2k and rarely ever goes above 900 MHz, let alone full boost at 1.13 GHz; the only thing that will consistently run it up there are Unigine demos).

Reply 4 of 30, by Gemini000

Rank: l33t

The only trick I've noticed with leaving the power management system on and not going for maximum performance is that some games don't properly trip the flip into maximum performance mode until after a minute or so of gameplay. So when you initially start playing, the framerate isn't anywhere near as good as it should be, and then all of a sudden it just skyrockets and stays there... and then if there's a drop in needed power, you have to go through this AGAIN.

For most games leaving power management set to Adaptive is OK, since most games typically need a certain level of GPU power and don't fluctuate, but any game which has frequent and drastic changes in needed GPU power will have erratic framerates.

I find it odd though that your GPU goes full blast all the time if you set to prefer maximum performance in all applications. These settings are only supposed to affect 3D programs, so if no 3D programs are running the GPU shouldn't be burning up more power. (And I can assure you, I have this set this way and my GPU is not running full blast all the time.) Now, something I should quickly point out: The nVidia control panel ITSELF is a 3D program, so to see the proper clocking effects of setting prefer maximum performance you need to close out the nVidia control panel entirely and make sure no other programs are running which require hardware accelerated graphics.

If you're still having the GPU stay ramped up and burning power then yeah, set it to adaptive globally and any time you run into a game where the framerate isn't perfect just go into the nVidia control panel, click into the recent programs list, select your game from the top and set it to prefer maximum performance.

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 5 of 30, by obobskivich

Rank: l33t
Gemini000 wrote:

The only trick I've noticed with leaving the power management system on and not going for maximum performance is that some games don't properly trip the flip into maximum performance mode until after a minute or so of gameplay. So when you initially start playing, the framerate isn't anywhere near as good as it should be, and then all of a sudden it just skyrockets and stays there... and then if there's a drop in needed power, you have to go through this AGAIN.

For most games leaving power management set to Adaptive is OK, since most games typically need a certain level of GPU power and don't fluctuate, but any game which has frequent and drastic changes in needed GPU power will have erratic framerates.

I've never witnessed my Kepler card do this; my Fermi and NV30 behave sort of this way, though. Most games IME have fairly fluctuating requirements - I've sat and watched this in GPU-Z, and the GTX 660 doesn't stay at any single clock all the time; it moves up and down in response to the game.

Again, I do not care one lick about frame-rates above the monitor field rate, so if you're worried about 60 vs 80 vs 1300 FPS for benchmarking reasons, I can't help you there. But with standard vsync and GPU Boost enabled, I've had no problems whatsoever with the 660 (honestly no problems with vsync off either, but I don't like tearing).

Gemini000 wrote:

I find it odd though that your GPU goes full blast all the time if you set to prefer maximum performance in all applications. These settings are only supposed to affect 3D programs, so if no 3D programs are running the GPU shouldn't be burning up more power. (And I can assure you, I have this set this way and my GPU is not running full blast all the time.) Now, something I should quickly point out: The nVidia control panel ITSELF is a 3D program, so to see the proper clocking effects of setting prefer maximum performance you need to close out the nVidia control panel entirely and make sure no other programs are running which require hardware accelerated graphics.

If you're still having the GPU stay ramped up and burning power then yeah, set it to adaptive globally and any time you run into a game where the framerate isn't perfect just go into the nVidia control panel, click into the recent programs list, select your game from the top and set it to prefer maximum performance.

Lots of applications that monitor clocks, like GPU-Z, will drive the clocks up too. Anything in the background that uses GPU resources can also be a problem; that can include things like video acceleration, desktop compositing (which generally requires very high resolutions to see an impact), etc.

Reply 6 of 30, by m1so

Rank: Member

Sorry for adding to a month-old thread, but does alt-tabbing away from the app that is set to "Prefer maximum performance" change anything, or does the setting stay in effect as long as the process is running?

Reply 7 of 30, by obobskivich

Rank: l33t
m1so wrote:

Sorry for adding to a month-old thread, but does alt-tabbing away from the app that is set to "Prefer maximum performance" change anything, or does the setting stay in effect as long as the process is running?

What do you mean? Like if you tell it that notepad.exe should prefer maximum performance, will having notepad.exe open and minimized force it to max clocks all the time? Or do you mean: if you told it for, say, World of Warcraft, to prefer max performance and alt-tab out to check your email, will it restore max performance when WoW regains focus?

Either way I would say: why not try it and find out, since it seems like you have this card and it wouldn't be terribly hard to test. My guess is that the former situation wouldn't work (and it sounds like a very contrived way of trying to outsmart the stuff you're working with instead of using a genuinely functional feature), but it shouldn't have issues with the latter - I never had problems alt-tabbing from games that allow it and recovering whatever GPU Boost clocks I previously had. I would also again ask WHY you're so adamant about defeating the power management: it will go "full clock" in games that actually need that level of performance (and lots of older ones do not), and otherwise it will manage its power very effectively. That's what it's designed to do, and that's what it does.

Reply 8 of 30, by Gemini000

Rank: l33t

One thing I can add though is that if you task-switch out of a game and leave it in a state where it doesn't need to render anything to the screen at all (IE: minimized or with a different program maximized), the GPU will indeed start saving power until you bring the game back up, at which point the framerate will be terrible for two or three seconds until the GPU recognizes what's going on and kicks the power back in.

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 9 of 30, by Procyon

Rank: Member

I use RivaTuner Statistics Server to cap my framerate to 65 fps; my display is 60 Hz, so my card delivers maximum performance while not having to produce unnecessary frames, and it stays relatively cool.
I don't like to use v-sync because of input lag, and to be honest I've yet to play a game with so much screen tearing that I had to turn it on.
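A frame cap like this is conceptually just a sleep added to the render loop so the game never presents more than the target number of frames per second. A rough, purely illustrative sketch of the idea in Python - this is not how RTSS actually hooks the graphics API:

```python
import time

TARGET_FPS = 65                    # cap slightly above the 60 Hz refresh, as above
FRAME_BUDGET = 1.0 / TARGET_FPS    # ~15.4 ms per frame

def render_frame():
    """Stand-in for the game's actual rendering work."""
    time.sleep(0.005)  # pretend a frame takes 5 ms to draw

def capped_loop(num_frames: int = 10) -> None:
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        # Sleep away whatever is left of the frame budget so the GPU isn't
        # asked to draw hundreds of unnecessary frames per second.
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)

capped_loop()
```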

Reply 10 of 30, by Gemini000

Rank: l33t
Procyon wrote:

I use RivaTuner Statistics Server to cap my framerate to 65 fps; my display is 60 Hz, so my card delivers maximum performance while not having to produce unnecessary frames, and it stays relatively cool.
I don't like to use v-sync because of input lag, and to be honest I've yet to play a game with so much screen tearing that I had to turn it on.

Err... if you're suffering input lag with vertical sync enabled then something is wrong or misconfigured... >_>;

There are extremely few reasons not to use vertical sync nowadays. It results in the smoothest gameplay possible, and it cuts down on CPU usage (and therefore power) by not burning cycles during the actual vsync wait.

If you're having problems using vsync, consider the following:

1. Update your drivers. Always do this first if you run into issues that are GPU related.

2. Reduce the maximum number of pre-rendered frames allowed. The default is up around 3, which is kind of absurd. 2 should be good enough to keep the framerate high and make the lag unnoticeable, if the game even needs to get into pre-rendering frames. For older games which can't properly process pre-rendered frames, just disable this feature for them. They probably wouldn't have elaborate enough graphics to demand maximum GPU power anyways. :P

3. Double check when the game is running that it's actually using the correct refresh rate on your display. I've seen games trigger 24 Hz and 50 Hz modes on 60 Hz monitors, usually because those modes come first when requesting mode lists from the GPU and the game programmer figured to just choose the first refresh rate returned, as opposed to looking through all refresh rates available for the selected resolution and picking the optimal one (see the sketch after this list). The workaround, if you can't simply delete the 24 Hz and 50 Hz modes, is to create a custom resolution that's 2 or 4 pixels smaller, set up the timing so that it simply cuts those rows or columns out instead of stretching the screen, and then make sure that resolution ONLY has a 60 Hz mode available.

4. If you're using a TV instead of a PC monitor, make sure it's set to "Game Mode". The crazy thing is that most modern TVs will STILL have some lag even with game mode activated. >_<;
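To make point 3 above concrete, here's a small Python sketch (the mode list is invented for illustration) of the difference between taking the first mode that matches a resolution and taking the one with the highest refresh rate:

```python
# Hypothetical mode list as a driver might report it: (width, height, refresh_hz).
modes = [
    (1920, 1080, 24),
    (1920, 1080, 50),
    (1920, 1080, 60),
    (1280, 720, 60),
]

def first_match(modes, width, height):
    """What a lazy game does: take the first mode that matches the resolution."""
    return next(m for m in modes if m[:2] == (width, height))

def best_match(modes, width, height):
    """What it should do: take the highest refresh rate for that resolution."""
    return max((m for m in modes if m[:2] == (width, height)), key=lambda m: m[2])

print(first_match(modes, 1920, 1080))  # (1920, 1080, 24)  <- 24 Hz picked by accident
print(best_match(modes, 1920, 1080))   # (1920, 1080, 60)
```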

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 11 of 30, by PhilsComputerLab

Rank: l33t++

You always got / get input lag with V-sync 😀

If your monitor supports higher refresh rates, try playing at 75 Hz, because input lag is reduced the higher the refresh rate is.

One benefit of V-sync on a high-powered graphics card is that it can reduce or eliminate coil whine.

YouTube, Facebook, Website

Reply 12 of 30, by smeezekitty

Rank: Oldbie

obobskivich wrote:

They broke advanced shadows in The Sims 2 after 320.something,

Off topic, but AMD broke their drivers in exactly the same way (black boxes in The Sims 2 with shadows set to high).

It must have been a fix for something else that both manufacturers made which inadvertently broke it. Annoying, though.

Reply 14 of 30, by Gemini000

Rank: l33t
philscomputerlab wrote:

You always got / get input lag with V-sync :)

If your monitor supports higher refresh rates, try playing at 75 Hz, because input lag is reduced the higher the refresh rate is.

o_O

I feel like I'm being trolled, since over 20 years of programming and gaming experience tells me that first point is a lie. I can understand where some people might be coming from if they think it's true, though, because the second point is technically true - but only if the lag being experienced is tied to the number of frames being rendered. (IE: you're lagging by 3 frames.) Only then would having more frames per second reduce the lag experienced, since each frame would go by faster.

As I alluded to though, modern graphics cards, even going back a whole decade, have had the ability to "pre-render" frames. Early cards did this in the background and didn't expose an option to adjust the setting, not to mention applied it to every game, sometimes creating odd problems. This is partly what led to programs like RivaTuner, since people wanted to be able to adjust settings like these which weren't being provided by the drivers themselves, and pre-rendered frames used to be one of the more insidious ones. Nowadays, pre-rendered frames can not only be adjusted in the graphics drivers themselves, it's treated as a "maximum" limit and only comes into effect if the framerate can be improved by rendering frames ahead of time. The default maximum limit on nVidia cards is 3, but bumping this down to 2 prevents "noticeable" lag from showing up. Input lag is only "noticeable" if you perceive a difference between when you hit a button and when the action occurs on-screen, which is ultimately what matters - unless you want to argue a point which requires you to be angry that the microseconds it takes for the video signal to reach the screen are not good enough. :P

BTW: Another optimization a lot of people don't know a lot about with nVidia cards is called "Threaded Optimization". Turning this option on will typically improve your framerate to a minor degree in more advanced games. However, simpler games don't benefit from this option, yet with this option enabled, those simpler games will burn several times more CPU power because of all the thread rebalancing going on. Some games will literally go from 25% CPU usage down to 1% if you disable this feature! :O

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 15 of 30, by PhilsComputerLab

Rank: l33t++

A lie? I'm sorry but you're wrong. V-sync has always given input lag. Any FPS gamer can tell you that.

Now you might not notice it but that just puts you into the same ignorance is bliss category as people not noticing a difference between 30 / 60 / 120 frames, watching a movie in 24p mode vs. 30p and so on.

YouTube, Facebook, Website

Reply 16 of 30, by eL_PuSHeR

Rank: l33t++

Interesting read. I really hate screen tearing, so I always run with VSYNC ON + triple buffering, even if I notice a lot of input lag from my Kepler card (especially running Legend of Grimrock 2). Any other games run perfectly fine for me, even FPS shooters (Counter-Strike: GO, Left 4 Dead 2, etc.).

I have switched the pre-rendered frames setting from default to 2.

Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
8 GB GeForce GTX 1070 G1 Gaming (Gigabyte)

Reply 17 of 30, by Gemini000

Rank: l33t
philscomputerlab wrote:

A lie? I'm sorry but you're wrong. V-sync has always given input lag. Any FPS gamer can tell you that.

Now you might not notice it but that just puts you into the same ignorance is bliss category as people not noticing a difference between 30 / 60 / 120 frames, watching a movie in 24p mode vs. 30p and so on.

I notice the difference between 30/60/120 frames. A good portion of my life was spent using a 120 Hz CRT monitor. :B

While "noticeable" is subjective, for me it's a VERY short duration. I've noticed lag on some people's systems where they perceive none. You can say v-sync has always given input lag all you want but here's a couple points that tears your argument to pieces:

1. To claim that vertical syncing causes lag while the lack of vertical sync does not, when the framerate for both scenarios is identical (or close enough), one must assume that the process of requesting a vsync delays the processing of standard I/O. Rest assured, this doesn't happen, and even if it did, the most you could possibly delay the I/O by would be a single frame, since once the vertical sync was over with, I/O would once again be processed. At 60 FPS this translates into about 0.0167 seconds. The vast majority of people on the planet are incapable of perceiving this short of a delay.

2. HUNDREDS of DOS games perform vertical sync operations, including classic FPS games such as Doom and Wolfenstein 3D. You're telling me that these games have had input lag since all the way back in the days of DOS? :P

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 18 of 30, by PhilsComputerLab

Rank: l33t++

On this page alone 3 users have observed input lag with V-sync. You are the only one saying it's a lie and that there is no input lag with v-sync enabled, or if there is then it's an issue with the system.

Gemini000 wrote:

for both scenarios is identical (or close enough)

Hand-picked scenario and not realistic: getting exactly 60 fps both without V-sync and with V-sync. And it ignores the fact that fps halves when going under the refresh rate, which is THE situation in which you do NOT want to use V-sync (fps close to refresh rate).

2. HUNDREDS of DOS games perform vertical sync operations, including classic FPS games such as Doom and Wolfenstein 3D. You're telling me that these games have had input lag since all the way back in the days of DOS? 😜

No relation to GeForce whatsoever.

The vast majority of people on the planet are incapable of perceiving this short of a delay.

🤣 Which is exactly the point I made (ignorance is bliss). Although I see it more as something they haven't been shown / don't know what to look for, rather than something that doesn't exist.

And of course 16 ms is noticeable; you yourself say that you notice the difference between 60 Hz (every 16 ms) and 120 Hz (every 8 ms).

eL_PuSHeR wrote:

Interesting read. I really hate screen tearing, so I always run with VSYNC ON + triple buffering, even if I notice a lot of input lag from my Kepler card (especially running Legend of Grimrock 2). Any other games run perfectly fine for me, even FPS shooters (Counter-Strike: GO, Left 4 Dead 2, etc.). I have switched the pre-rendered frames setting from default to 2.

These are good tweaks. V-sync and triple buffering is as good as it gets. If you want even better results there is G-Sync, but that's quite an expensive solution. I've never seen a G-Sync display in action, but display-sensitive people rave about it. And some game engines are worse than others.

YouTube, Facebook, Website

Reply 19 of 30, by Gemini000

Rank: l33t
philscomputerlab wrote:

On this page alone 3 users have observed input lag with V-sync. You are the only one saying it's a lie and that there is no input lag with v-sync enabled.

I didn't say anyone was wrong about that. One of the first things I said was:

Gemini000 wrote:

Err... if you're suffering input lag with vertical sync enabled then something is wrong or misconfigured... >_>;

THAT is what matters. I've been using vsync for a very long time now across many different systems, and the first time I noticed lag was after downloading new video drivers many years ago; when I investigated why, I discovered the whole pre-rendered frames thing, altered that, and haven't suffered lag since.

To put this in perspective, say you're playing a game where you can hit 240 FPS on a display that can only handle 60 Hz, and you've got 4 pre-rendered frames going at a time. With vsyncing, each frame would be 0.0167 seconds long, so it takes 0.0667 seconds for any input to register. With vsyncing off, it takes 0.0167 seconds to process 4 frames at 240 FPS, so the input registers four times faster. This isn't the fault of vsync though, it's the fault of the pre-rendered frames, since if there were no pre-rendered frames at all, everything would happen instantly.
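The arithmetic above is easy to tabulate. A tiny Python sketch using the same assumed numbers (60 Hz display, 240 FPS uncapped render rate):

```python
# Worst-case extra input latency from a queue of pre-rendered frames: with
# vsync on, each queued frame occupies a full refresh interval; with vsync
# off, it only occupies one (much shorter) render interval.

REFRESH_HZ = 60
UNCAPPED_FPS = 240

def queue_latency_ms(pre_rendered_frames: int, frame_rate: float) -> float:
    return pre_rendered_frames * 1000.0 / frame_rate

for frames in (1, 2, 3, 4):
    with_vsync = queue_latency_ms(frames, REFRESH_HZ)
    without = queue_latency_ms(frames, UNCAPPED_FPS)
    print(f"{frames} pre-rendered frame(s): ~{with_vsync:5.1f} ms with vsync, "
          f"~{without:4.1f} ms without")
# 4 pre-rendered frame(s): ~ 66.7 ms with vsync, ~16.7 ms without
```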

So I guess if you want to be REALLY specific, vsyncing and pre-rendered frames working together are what causes lag. Turn either or both off and the lag disappears, but with both on you will get lag. The thing is, the pre-rendered frames limit is adjustable and can still cause problems, especially in games that are going over the monitor refresh rate but not by much, or with older games that don't recognize the "maximum" nature of the pre-rendered frames limit and will just render as many as allowed all the time, no matter the framerate.

This is why I consider vsyncing itself not to be the issue.

philscomputerlab wrote:

No relation to GeForce whatsoever.

Ahem...

philscomputerlab wrote:

You always got / get input lag with V-sync :)

Use of the word "got" suggests that when you said "always", you meant this has been a problem since the dawn of vertical sync in PC gaming and not just a GeForce issue.

Also when you said...

philscomputerlab wrote:

Hand-picked scenario and not realistic: getting exactly 60 fps both without V-sync and with V-sync.

...I was actually relating to this post in the thread:

Procyon wrote:

I use RivaTuner Statistics Server to cap my framerate to 65 fps; my display is 60 Hz, so my card delivers maximum performance while not having to produce unnecessary frames, and it stays relatively cool.
I don't like to use v-sync because of input lag, and to be honest I've yet to play a game with so much screen tearing that I had to turn it on.

So here's my challenge to you, Phil: Load up a first person shooter that uses as little GPU power as possible, maybe Quake III or something along those lines, and with your settings just the way you want them, measure how easily you can play the game. Then go into your graphics driver settings, set pre-rendered frames to 0 or 1 (whichever is the lowest; do NOT pick "default" or "auto"), turn on vertical sync, turn on triple buffering, go into your game, turn on vsync in the game's options, shut down and restart the game so vsyncing can properly take effect, then once again play it and measure how easily you can play the game.

One last thing I should point out: Some game producers have answered the problem of input lag not by educating gamers to adjust settings to reduce or eliminate it, but by forcing vsync off where possible and capping the framerate at insane amounts that no video card can physically render. As a programmer, this behavior irritates me because it solves a problem that shouldn't exist simply by throwing more processing power than needed at it; but then, the average computer user doesn't know enough about computers to understand the problem and would rather lash out at a developer, blaming them for making an inferior product, than just change a couple of settings and make something work perfectly. At the same time though, I think a lot of game developers DO recognize this, because they often still leave vsync settings in and sometimes allow adjusting the framerate cap. With every game I've acquired over the past 16 years, I optimize for vsynced gameplay and get virtually no input lag EVERY time.

I also really hate the whole "Game Mode" thing with TVs because it just adds more lag and confusion to this whole mess and constantly has me worried that computer monitors may one day be laggy too. x_x;

philscomputerlab wrote:

Which is exactly the point I made (ignorance is bliss). Although I see it more as something they haven't been shown / don't know what to look for, rather than something that doesn't exist.

And of course 16 ms is noticeable; you yourself say that you notice the difference between 60 Hz (every 16 ms) and 120 Hz (every 8 ms).

Not a contradiction. I visually perceive the difference between 60 and 120 Hz, but the margin of error in the actions I perform on a controller, keyboard or mouse is much larger, so I can endure a frame or two of lag without noticing. I cannot perceive 1 frame of input lag. 2 frames I can perceive, but only if I'm specifically trying to. 3 frames I can always perceive, and I cannot control a game properly with that much lag or more.

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg