philscomputerlab wrote:On this page alone 3 users have observed input lag with V-sync. You are the only one saying it's a lie and that there is no input lag with v-sync enabled.
I didn't say anyone was wrong about that. One of the first things I said was:
Gemini000 wrote:Err... if you're suffering input lag with vertical sync enabled then something is wrong or misconfigured... >_>;
THAT is what matters. I've been using vsync for a very long time now across many different systems, and the first time I noticed lag was after downloading new video drivers many years ago. When I investigated why, I discovered the whole pre-rendered frames thing, changed that setting, and haven't suffered lag since.
To put this in perspective, say you're playing a game where you can hit 240 FPS on a display that can only handle 60 Hz, and you've got 4 pre-rendered frames queued at a time. With vsync on, each frame is displayed for 1/60 ≈ 0.0167 seconds, so it takes roughly 4 × 0.0167 ≈ 0.0667 seconds for any input to reach the screen. With vsync off, the card churns through those 4 frames at 240 FPS in about 4/240 ≈ 0.0167 seconds, so the input registers four times faster. This isn't the fault of vsync though, it's the fault of the pre-rendered frames: with no frames queued at all, your input would show up on the very next frame either way.
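If it helps, here's a quick back-of-envelope version of that arithmetic in Python (purely illustrative; the 60 Hz / 240 FPS / 4-frame numbers are just the ones from the example above):

```python
# Rough input-delay estimate for the example above: a 60 Hz display,
# a game that can render 240 FPS, and 4 pre-rendered frames queued up.
refresh_hz = 60
render_fps = 240
queued_frames = 4

# With vsync on, each queued frame occupies a full refresh interval,
# so new input only shows up once the whole queue has drained.
delay_vsync_on = queued_frames / refresh_hz

# With vsync off, the queue drains at the full render rate instead.
delay_vsync_off = queued_frames / render_fps

print(f"vsync on:  {delay_vsync_on * 1000:.1f} ms")   # ~66.7 ms
print(f"vsync off: {delay_vsync_off * 1000:.1f} ms")  # ~16.7 ms
```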
So I guess if you want to be REALLY specific, vsync and pre-rendered frames working together are what causes the lag. Turn either or both off and the lag disappears, but with both on you will get lag. The thing is, the pre-rendered frames setting is adjustable and can still cause problems, especially in games whose framerate is only slightly above the monitor's refresh rate, or in older games that don't treat the setting as a "maximum" and will just keep the queue full all the time no matter the framerate.
This is why I consider vsyncing itself not to be the issue.
philscomputerlab wrote:No relation to GeForce whatsoever.
Ahem...
philscomputerlab wrote:You always got / get input lag with V-sync :)
Use of the word "got" suggests that when you said "always", you meant this has been a problem since the dawn of vertical sync in PC gaming and not just a GeForce issue.
Also when you said...
philscomputerlab wrote:Hand picked scenario and not realistic: Getting exactly 60 fps without V-sync and with V-sync.
...I was actually relating to this post in the thread:
Procyon wrote:I use RivaTuner Statistics Server to cap my framerate to 65 fps. My display is 60 Hz, so my card delivers maximum performance while not having to produce unnecessary frames, and it stays relatively cool.
I don't like to use v-sync because of input lag, and to be honest I've yet to play a game with so much screen tearing that I have to turn it on.
So here's my challenge to you, Phil: load up a first-person shooter that uses as little GPU power as possible, like maybe Quake III or something along those lines, and with your settings just the way you want them, measure how easily you can play the game. Then go into your graphics driver settings, set pre-rendered frames to 0 or 1 (whichever is the lowest available; do NOT pick "default" or "auto"), turn on vertical sync, and turn on triple buffering. Go into your game, turn on vsync in the game's options, then shut down and restart the game so vsync can properly take effect. Once again, play it and measure how easily you can play the game.
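For a rough idea of what that test is getting at (same back-of-envelope math as before, and the queue sizes are only illustrative), the worst-case queue delay at 60 Hz scales directly with the pre-rendered frame count:

```python
# Worst-case input delay at a 60 Hz refresh for different
# pre-rendered frame settings (illustrative numbers only).
refresh_hz = 60

for queued_frames in (1, 2, 3, 4):
    delay_ms = queued_frames * 1000 / refresh_hz
    print(f"{queued_frames} pre-rendered frame(s): ~{delay_ms:.1f} ms of queue delay")
```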
One last thing I should point out: some game producers have answered the problem of input lag not by educating gamers to adjust settings to reduce or eliminate it, but by forcing vsync off where possible and capping the framerate at insane values that no video card can physically render. As a programmer, this behavior irritates me because it solves a problem that shouldn't exist simply by throwing more processing power than needed at it. But then, the average computer user doesn't know enough about computers to understand the problem and would rather lash out at a developer, blaming them for making an inferior product, than just change a couple of settings and make everything work perfectly. At the same time though, I think a lot of game developers DO recognize this, because they often still leave vsync settings in and sometimes allow adjusting the framerate cap. With every game I've acquired over the past 16 years, I optimize for vsynced gameplay and get virtually no input lag EVERY time.
I also really hate the whole "Game Mode" thing with TVs because it just adds more lag and confusion to this whole mess and constantly has me worried that computer monitors may one day be laggy too. x_x;
philscomputerlab wrote:Which is exactly the point I made (ignorance is bliss). Although I see it more as something they haven't been shown / don't know what to look for rather than claiming it doesn't exist.
And of course 16 ms is noticeable; you yourself say that you notice the difference from 60 Hz (every 16 ms) to 120 Hz (every 8 ms).
Not a contradiction. I visually perceive the difference between 60 and 120 Hz, but the margin of error in the actions I perform on a controller, keyboard, or mouse is much larger, so I can endure a frame or two of lag without noticing. I cannot perceive 1 frame of input lag. 2 frames I can perceive, but only if I'm specifically trying to. 3 frames I always perceive, and I cannot control a game properly with that much lag or more.
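To put those thresholds in milliseconds (assuming a 60 Hz display, same as the earlier example), here's a tiny sketch:

```python
# Convert frames of input lag into milliseconds at 60 Hz.
refresh_hz = 60
frame_ms = 1000 / refresh_hz

for frames in (1, 2, 3):
    print(f"{frames} frame(s) of lag ~= {frames * frame_ms:.1f} ms")
# 1 frame  ~= 16.7 ms  (I can't perceive it)
# 2 frames ~= 33.3 ms  (perceptible only if I'm looking for it)
# 3 frames ~= 50.0 ms  (always perceptible; the game becomes hard to control)
```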