VOGONS

GeForce power management mode

Reply 20 of 30, by PhilsComputerLab

Rank: l33t++

With each post you are just throwing more complexity at the topic, but none of it changes the fact that what you said is simply wrong.

Weird, because in essence you agree with me. Summarising your elaboration, it boils down to this: V-sync does add input lag, there are measures to reduce that impact, and after tweaking some people can no longer notice it while others still can. Which is exactly what I've been saying: for you it's not noticeable after tweaking, while for me and many others it still is.

Saying that this works for all games can be challenged, because I believe triple buffering cannot be enabled for every game / engine, or can even make the outcome worse. I also still easily notice the difference, even with triple buffering and pre-rendered frames set to 1, for example. Also, at least on my GTX 660 with the latest driver, there is no setting of 0.

The fact that this setting (pre-rendered frames) doesn't apply to SLI only adds to the point that one's personal experience doesn't necessarily apply to others.

For anyone wanting to know more about this interesting topic there is an excellent article from Anandtech: http://www.anandtech.com/show/2794/2

YouTube, Facebook, Website

Reply 21 of 30, by Gemini000

Rank: l33t
philscomputerlab wrote:

With each post you are just throwing more complexity at the topic, but none of it changes the fact that what you said is simply wrong.

Weird, because in essence you agree with me. Summarising your elaboration, it boils down to this: V-sync does add input lag, there are measures to reduce that impact, and after tweaking some people can no longer notice it while others still can. Which is exactly what I've been saying: for you it's not noticeable after tweaking, while for me and many others it still is.

...you know, I run into this weird situation with the rare person where I can never seem to be on the same page as them, even though we both know what we're talking about. I think you're one of those people... Sorry about any future arguments we're bound to have as a result. >;D

For me, whenever someone says they're experiencing "input lag" I imagine the worst-case scenarios I've witnessed, where there's almost a quarter-second delay between pressing a button and an action happening. To that end, I find it difficult to comprehend that anyone could perceive an interval faster than a couple of hundredths of a second when trying to perform an action... but now I'm thinking back to some of those videos I've seen of people clearing some of the most insanely difficult modes in arcade games, or of people playing fighting games who have the timing down extremely tight.

It really does come down to perception of lag, and regardless of everything we've both said, I would always recommend someone try turning down pre-rendered frames and enabling triple buffering before they try disabling vsync, because that might just be good enough for them and will result in zero screen tearing. :B
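
As a concrete illustration of where these settings live on the application side, here is a minimal Direct3D 9 sketch (not from this thread, and heavily simplified) of how a game would itself request v-sync plus an extra back buffer; the window handle and error handling are assumed.

```cpp
// Minimal sketch (not from this thread): how a Direct3D 9 game would itself
// ask for v-sync plus an extra back buffer ("triple buffering"), rather than
// having the driver force it. Error handling is stripped and hWnd is assumed
// to be a valid window handle.
#include <d3d9.h>

IDirect3DDevice9* CreateVsyncedDevice(IDirect3D9* d3d, HWND hWnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // use the current desktop format
    pp.BackBufferCount      = 2;                       // two back buffers = "triple buffering"
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // wait for vertical sync on each Present()

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device; // nullptr if creation failed; real code should check the HRESULT
}
```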

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 22 of 30, by PhilsComputerLab

Rank: l33t++

Yes, that's what it boils down to. But it's good to hear both sides, and everyone can check it out for themselves.

YouTube, Facebook, Website

Reply 23 of 30, by eL_PuSHeR

Rank: l33t++

I also have a GeForce GTX 660 Ti, and after reading your posts and doing extensive testing I agree that Legend of Grimrock 2's input lag is almost non-existent when pre-rendered frames are set to 1 (yep, there is no 0 value under NVIDIA's control panel). So it seems input lag is directly related to pre-rendered frames, at least for LoG2. As for the other games, I don't notice any input lag at all. I am pretty sure D3D9 is also a lot to blame here. It's not the most optimized graphics API, you know. I am not a programmer, but it seems its execution pipeline and other things are quite bad.

Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
8 GB GeForce GTX 1070 G1 Gaming (Gigabyte)

Reply 24 of 30, by Gemini000

Rank: l33t

The way a game is programmed factors heavily into it as well. With a lot of modern games using other people's engines, some of which can be pretty old by this point, they may not be optimized or written to properly handle some of the optimization settings available in graphics drivers.

Pre-rendered frames is one of the biggest ones. When it was introduced a little over a decade ago, virtually nothing could handle it properly, not even benchmarking software! I remember testing out one of the 3DMark programs at the time (the one with the whole Matrix-inspired lobby sequence) and noticing in the tests that if pre-rendered frames were on it would just render the same frame multiple times and end up killing the framerate visually, even though it was REPORTING a higher rate because it really was rendering multiple frames! :O

You do have to be careful though. Some modern games heavily rely on pre-rendered frames to keep the framerate going because of how they do their processing and rendering. Poor framerates can sometimes be solved simply by adding an additional pre-rendered frame... but then you've got the whole lag thing to worry about, so it becomes a matter of finding a balance between framerate and input lag. :P
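
For reference, the application-side equivalent of the driver's "Maximum pre-rendered frames" cap is a frame-latency limit. The following is a hedged sketch using the D3D9Ex interface available since Vista; it isn't from the thread and assumes the game already has an `IDirect3DDevice9Ex` device.

```cpp
// Hedged sketch: an application-side equivalent of the driver's
// "Maximum pre-rendered frames" setting. On Vista and later, a D3D9Ex
// device can cap how many frames the CPU may queue ahead of the GPU,
// which is exactly the queue that trades framerate against input lag.
#include <d3d9.h>

void CapRenderAhead(IDirect3DDevice9Ex* deviceEx)
{
    // 1 = at most one frame queued ahead: lowest latency, but the GPU can
    //     starve and framerates may drop if the CPU can't keep it fed.
    deviceEx->SetMaximumFrameLatency(1);
}
```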

Threaded Optimization is another setting that can cause weird issues, but only with very few games. The most this setting usually does wrong is dramatically increase CPU usage for games which normally use very little. :P

Anisotropic Filtering usually makes things look really good, at least in 3D, but it can cause strange artifacts in some 2D games and can make thin lines on textures look blurry. Doesn't matter for most 2D games but is something to watch out for.
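
To illustrate why a driver-forced override can surprise a game, here is a rough sketch of how a D3D9 title would normally opt into anisotropic filtering itself; the driver setting effectively imposes similar sampler state on games that never asked for it. The function and parameter names are illustrative, not from the thread.

```cpp
// Rough sketch of how a D3D9 game would normally opt into anisotropic
// filtering itself (sampler 0 shown). The driver override forces similar
// state onto games that never asked for it, which is how 2D titles can end
// up with oddly filtered sprites or blurred thin lines.
#include <d3d9.h>

void EnableAnisotropicFiltering(IDirect3DDevice9* device, DWORD maxAniso /* e.g. 16 */)
{
    device->SetSamplerState(0, D3DSAMP_MINFILTER,     D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER,     D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER,     D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, maxAniso);
}
```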

One thing that can dramatically change performance on all cards is the "Texture Filtering Quality" setting. This setting has been around for a long time: essentially, the more you set it for performance over quality, the better your framerate will be, but textures will be blurrier at closer distances. The funny thing is, this setting alone can create a MASSIVE difference in framerate. Nowadays the high-quality setting still gives a decent enough FPS, but in the early 2000s this setting could mean the difference between 60 FPS and 10 FPS! :O

And of course, there's the setting this thread was started over: Power Management. As I said earlier, most games handle this well enough. The challenge is when a game occasionally switches between graphical situations that require tons of power and situations that require very little, as this confuses the GPU and may cause heavily decreased framerates when returning to complex rendering from simple rendering, until it finally clues in, realizes more power is needed, and ramps it back up. :P

I've also noticed, being on a Windows 8.1 system now, that the further back you go in your compatibility mode setting, the worse the framerates get. With any game using the Unity engine, I have to set compatibility to Windows 7, otherwise I get extremely strange visual stuttering or framerate drops, but this in turn slightly decreases the maximum performance I can expect. Setting it to Windows XP drops the maximum performance dramatically, though of the extremely few games I've run into with issues, Windows XP compatibility doesn't fix them anyway. :P

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 27 of 30, by obobskivich

Rank: l33t

Yes, it should recover the "prefer maximum" setting when the game is restored. There may be a slight hang-time while it re-adjusts, especially if the game doesn't immediately restore. For example, Empire: Total War will let you alt+tab out, but it will force a loading screen when it regains focus - the GeForce will not return to max clocks on that loading screen, but only when the game resumes after the loading ends.

Reply 28 of 30, by mr_bigmouth_502

Rank: Oldbie
Gemini000 wrote:

Also, if you both force vertical sync on and enable triple buffering, this will further prevent power from being used when it isn't necessary, by idling the GPU between frames when it's done rendering but still waiting to display the next frame because it's running faster than the display can physically handle. This won't affect performance, although any game which can't maintain a perfect 60 FPS may drop to 30 FPS instead of settling somewhere between those two extremes.

Does this just reduce power consumption on Nvidia cards, or will it work on Radeons as well?

Reply 29 of 30, by Gemini000

Rank: l33t
m1so wrote:

If I set the general management to adaptive and prefer maximum performance for a certain game, will it stay for the game even if I alt+tab meanwhile?

Well, as said before, switching between adaptive and maximum performance shouldn't have any effect when no programs are running. If it does, it suggests something running in the background is accessing 3D acceleration and thus triggering the card to power up.

However, yes: if you set a game to prefer maximum performance and set the general setting to adaptive, the GPU will go up to maximum power when the game runs, and when you tab out, so long as the game is still rendering, it will continue to use maximum performance. The thing to keep in mind, though, is that if you completely minimize a game it may stop rendering, so the system may go back to adaptive mode until you restore the game. Some games will even stop rendering the moment they lose focus, in which case, same effect.

BTW: Here's an interesting tidbit of programming info which is part of why I prefer OpenGL over Direct3D: D3D is designed to completely forget the contents of video memory when a program loses its rendering context, i.e. the moment it gets minimized. This is why some games have a delay when you alt-tab back to them. OpenGL does NOT have this issue, so you can alt-tab back to an OpenGL game instantly. I don't actually know why Direct3D was made this way, nor do I know why this behavior has never been changed despite the fact that nobody who programs games likes it. :P
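
For the curious, this is roughly the "device lost" dance a fullscreen D3D9 game has to go through on alt-tab, which is where that delay comes from. It's a hedged sketch only: `pp` and the two resource helpers are hypothetical stand-ins for the game's own code.

```cpp
// Hedged sketch of the "device lost" dance a fullscreen D3D9 game performs
// on alt-tab; this work (plus re-uploading textures) is where the delay
// comes from. pp is the D3DPRESENT_PARAMETERS the device was created with,
// and the two helpers are hypothetical stand-ins for the game's own handling
// of D3DPOOL_DEFAULT resources, which are the ones D3D throws away.
#include <d3d9.h>

void ReleaseDefaultPoolResources();   // hypothetical: release render targets, dynamic buffers, ...
void RecreateDefaultPoolResources();  // hypothetical: rebuild and re-upload them afterwards

bool HandleDeviceLost(IDirect3DDevice9* device, D3DPRESENT_PARAMETERS& pp)
{
    HRESULT hr = device->TestCooperativeLevel();

    if (hr == D3DERR_DEVICELOST)
        return false;                    // still lost (e.g. minimized): skip rendering this frame

    if (hr == D3DERR_DEVICENOTRESET)
    {
        ReleaseDefaultPoolResources();   // everything in D3DPOOL_DEFAULT must be released first
        if (FAILED(device->Reset(&pp)))
            return false;                // try again next frame
        RecreateDefaultPoolResources();
    }
    return true;                         // D3D_OK: device is usable again
}
```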

mr_bigmouth_502 wrote:

Does this just reduce power consumption on Nvidia cards, or will it work on Radeons as well?

Depends on the drivers, but it should. Just keep Task Manager open, play a game for a few moments, and see what the results are on CPU usage when using and not using vsync. :B

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 30 of 30, by mr_bigmouth_502

Rank: Oldbie

I managed to find the "tear free" option in my FGLRX settings; I just had to run the administrative version with sudo. (I'm on Linux, FYI 😜) From what I understand, this is the equivalent of vsync with triple buffering. It certainly helps fix the tearing I get when I scroll through webpages, but at the cost of some smoothness.