VOGONS

Nvidia "G-Sync" announced


First post, by d1stortion

Rank: Oldbie

http://www.geforce.com/whats-new/articles/int … ter-free-gaming

Looks very promising! Screen tearing is certainly the most annoying issue in games today, and this has the potential to eliminate it without causing atrocious VSync lag. Too bad it requires a new monitor, though. Hopefully they will put out IPS panels with this, even if it means only 60 Hz.

Reply 1 of 35, by Gemini000

Rank: l33t

As a programmer, I've been wondering for a long time when this would finally happen. The moment I read the name "G-Sync" I knew what was going on without even reading the article, and sure enough, it's exactly what I thought: dynamic syncing of the monitor to the graphics device, instead of syncing the graphics device to the monitor. :)
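
In pseudo-Python, the difference might be sketched like this (the primitives here are hypothetical stand-ins, not NVIDIA's actual API):

```python
# Minimal conceptual sketch of the two approaches
# (hypothetical primitives, not NVIDIA's actual API).

def vsync_loop(render_frame, wait_for_refresh, scanout):
    """Classic VSync: the GPU waits on the monitor's fixed clock."""
    while True:
        frame = render_frame()   # the frame may finish early or late...
        wait_for_refresh()       # ...but it can only be shown on the
        scanout(frame)           # monitor's fixed 60/120 Hz tick

def gsync_loop(render_frame, scanout):
    """G-Sync: the monitor waits on the GPU instead."""
    while True:
        frame = render_frame()   # the moment a frame is done...
        scanout(frame)           # ...the monitor refreshes with it
```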

So long as nVidia makes sure this standard can be easily adopted, screen tearing and response issues will become a thing of the past before too long. ;)

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 2 of 35, by j7n

Rank: Newbie

I've never been bothered by screen tearing in games. If a game couldn't be played at a smooth framerate above 40-ish fps with V-sync, that was a game out of my reach at the time. I scanned the article for mention of movie playback but didn't see any. Stuttering and/or tearing of "old tech" videos across a wide range of framerates is where the issue occurs most often.

If this G-Sync is to become a standard, would normal v-sync one day become unsupported, much like 16-bit color is today, reducing legacy games to a YouTube-like slideshow? I'm thinking of the kinds where the entire screen, or large sections of it, scrolls or pans smoothly.

the biggest leap forward in gaming monitors since we went from standard definition to high-def

As I recall it, computer monitors increased their resolution gradually; there was no leap. It did quite rapidly become the norm to game at insane resolutions, though (which is but one application of a monitor). And those who experienced the "leap" are more like slaves to their "full HD" LCD monitors, being forced to play without anti-aliasing or, indeed, without v-sync.

Reply 3 of 35, by Gemini000

Rank: l33t
j7n wrote:

I've never been bothered by screen tearing in games.

Probably not, but consider that even when triple-buffering a game, if the framerate doesn't divide evenly into your monitor's refresh rate, certain frames will stay on-screen longer than others, and there will be time skewing between the visible action in them. This results in a perceived DECREASE in the framerate below what it actually is. 37 FPS on a 60 Hz display looks like it's running slower than 30 FPS because of how badly it matches the screen's refresh rate. 24 FPS, the default for movies, also ends up being perceived at a lower rate on a 60 Hz monitor than on a 120 Hz monitor for the same reason, since 60 cannot be divided evenly by 24, but 120 can! :o
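
A quick back-of-the-envelope check illustrates this. Here's a minimal Python sketch, assuming ideal, instant rendering: with VSync, a frame ready at time f/fps is shown on the first refresh tick at or after that moment and stays up until the next frame's tick.

```python
import math

# How many refresh ticks each frame occupies when a fixed framerate
# is scanned out on a fixed-refresh display (ideal delivery assumed).
def durations_in_ticks(fps, hz, frames=10):
    ticks = [math.ceil(f * hz / fps) for f in range(frames + 1)]
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(durations_in_ticks(24, 60))   # [3, 2, 3, 2, ...] -> uneven judder
print(durations_in_ticks(24, 120))  # [5, 5, 5, 5, ...] -> perfectly even
print(durations_in_ticks(37, 60))   # irregular 1s and 2s -> perceived as
                                    # choppier than a steady 30 FPS
```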

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 4 of 35, by d1stortion

Rank: Oldbie
j7n wrote:

I've never been bothered by screen tearing in games. If a game couldn't be played at a smooth framerate above 40-ish fps with V-sync, that was a game out of my reach at the time. I scanned the article for mention of movie playback but didn't see any. Stuttering and/or tearing of "old tech" videos across a wide range of framerates is where the issue occurs most often.

If this G-Sync is to become a standard, would normal v-sync one day become unsupported, much like 16-bit color is today, reducing legacy games to a YouTube-like slideshow? I'm thinking of the kinds where the entire screen, or large sections of it, scrolls or pans smoothly.

There is a major misunderstanding going on here... First of all, try playing some GoldSrc games (HL, CS 1.6, etc.) on a modern PC and then tell me again that the tearing doesn't bother you. It's horrendous in those titles, at least in my experience (which kinda made me regret my PC upgrade at the time...).

Secondly, the problem with VSync is not that it cuts down performance but that it introduces a lot of input lag. In shooters, moving the mouse will usually feel like you are wading through water, making precise aiming impossible. With a gamepad it might be less noticeable, however.
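
For rough numbers, here's a small sketch. It assumes input is sampled at the start of a frame, a 60 Hz display, and one extra refresh interval of queueing per buffered frame; real-world lag varies by game and driver:

```python
# Rough worst-case input-to-photon lag with VSync at 60 Hz, assuming
# input sampled at frame start and one refresh interval per queued frame.
REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at 60 Hz

def vsync_lag_ms(queued_frames):
    # render interval + time the frame waits in the swap queue
    return (1 + queued_frames) * REFRESH_MS

print(vsync_lag_ms(1))  # double buffering: ~33 ms
print(vsync_lag_ms(2))  # triple buffering: ~50 ms
# Without VSync the newest frame scans out immediately (with tearing),
# which is why many players put up with tearing to keep aim responsive.
```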

Actually, they had an "Adaptive VSync" feature going with their current generation of GPUs, but apparently that doesn't cut it, so they came up with this hardware solution.

Finally, this G-Sync won't become an open standard, cuz it's Nvidia we are talking about. But regular VSync is going nowhere if that's what you like; you can force it through the driver now and will be able to in the future. The problem is just that for many people this doesn't cut it, and they prefer tearing over not being able to aim, myself included.

Last edited by d1stortion on 2013-10-19, 02:42. Edited 3 times in total.

Reply 6 of 35, by Mau1wurf1977

Rank: l33t++

😀

Good to see some progress!

Still quite weird that a decent CRT still runs circles around current monitors.

The power of instantaneous analogue 😁

OLED is still dragging its feet...

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 7 of 35, by d1stortion

Rank: Oldbie

Yeah, you can't beat CRTs, period. Sadly the mainstream is all about ergonomics, and that's why we are stuck with this LCD crap. FED/SED died before it could even arrive, and OLED is only for the professional market right now. Heck, they've taken it out of the new Vita...

Reply 8 of 35, by m1so

Rank: Member
d1stortion wrote:

Yeah, you can't beat CRTs, period. Sadly the mainstream is all about ergonomics, and that's why we are stuck with this LCD crap. FED/SED died before it could even arrive, and OLED is only for the professional market right now. Heck, they've taken it out of the new Vita...

Quoted for truth. Finally someone who said what was on my mind for such a long time.

Reply 9 of 35, by Gemini000

Rank: l33t
d1stortion wrote:

Yeah, you can't beat CRTs, period.

Unless you're talking power consumption, level of danger to repair/refurbish, image stability, flickering, phosphor fade with prolonged use, image burn-in, image quality when recorded with a camera, pixel sharpness deterioration with age, physical weight, physical size... I could go on. XD

Don't get me wrong though, CRTs had their time, and some of the things LCD/LED monitors have only become capable of in the past few years, CRTs could do way back when LCD technology was still ghosting like crazy. :P

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 10 of 35, by d1stortion

Rank: Oldbie
m1so wrote:
d1stortion wrote:

Yeah, you can't beat CRTs, period. Sadly the mainstream is all about ergonomics, and that's why we are stuck with this LCD crap. FED/SED died before it could even arrive, and OLED is only for the professional market right now. Heck, they've taken it out of the new Vita...

Quoted for truth. Finally someone who said what was on my mind for such a long time.

I have a Sony BVM monitor. With RGB input it simply destroys my 23" 1080p Eizo FS2333 with IPS panel in terms of color fidelity, viewing angles and black level. Mainstream IPS panels are nowadays still stuck with 6-bit + dithering, so the display definitely introduces banding, despite what reviewers want me to believe. And while the TN blackening is gone, there is the typical IPS glow, which is why the viewing angles are still not perfect. It's a crying shame really...

Gemini000 wrote:

Unless you're talking power consumption, level of danger to repair/refurbish, image stability, flickering, phosphor fade with prolonged use, image burn-in, image quality when recorded with a camera, pixel sharpness deterioration with age, physical weight, physical size... I could go on. 🤣

Some of those points can be subsumed under "ergonomics". And I find that even old CRTs, with phosphor ghosting and all of those standard issues, still beat a (good!) modern LCD in some image-quality aspects, which makes it even more embarrassing. Old LCD CCFL backlights (yes, before they had LEDs for backlighting) would fade too, btw, but you could replace them more easily. It's moot talking about durability anyway, as thanks to RoHS I would not expect modern electronics to survive for 25 years.

I do prefer LCDs for applications such as reading text/work, etc. however. But their limitations when it comes to entertainment are obvious.

Reply 11 of 35, by PhaytalError

Rank: Member

The caveat of this "G-Sync" board and "G-Sync" monitors is that it requires a GeForce GTX 650 Ti Boost or higher GPU.

DOS Gaming System: MS-DOS, AMD K6-III+ 400/ATZ@600Mhz, ASUS P5A v1.04 Motherboard, 32 MB RAM, 17" CRT monitor, Diamond Stealth 64 3000 4mb PCI, SB16 [CT1770], Roland MT-32 & Roland SC-55, 40GB Hard Drive, 3.5" Floppy Drive.

Reply 12 of 35, by Gemini000

Rank: l33t
PhaytalError wrote:

The caveat of this "G-Sync" board and "G-Sync" monitors is that it requires a GeForce GTX 650 Ti Boost or higher GPU.

That's probably a limitation of getting the new tech working with current video cards, which were developed beforehand. I'm going to guess any new cards nVidia produces from this point forward will be G-Sync capable. ;)

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 13 of 35, by 133MHz

Rank: Oldbie
Gemini000 wrote:
d1stortion wrote:

Yeah, you can't beat CRTs, period.

Unless you're talking power consumption, level of danger to repair/refurbish

CRT: Line voltage danger at the main filter capacitors, high voltage danger at the flyback transformer output. Cathode ray tube may break due to mishandling, ruining your day.
LCD: Line voltage danger at the main filter capacitors, high voltage danger at the CCFL inverter output. Liquid crystal panel may break due to mishandling, ruining your day.

I don't see much of a difference in that regard. Well, except that a big CRT could give you a hernia 😵

http://133FSB.wordpress.com

Reply 14 of 35, by sliderider

Rank: l33t++
PhaytalError wrote:

The caveat of this "G-Sync" board and "G-Sync" monitors is that it requires a GeForce GTX 650 Ti Boost or higher GPU.

But will it work with AMD video cards? That's the big question. I don't think anyone is going to buy a monitor that locks you in with nVidia forever.

133MHz wrote:

I don't see much of a difference in that regard. Well, except that a big CRT could give you a hernia 😵

You think this CRT might give you a hernia?

monitor.jpg

And if that's not big enough for you, I used to have a 40" CRT that once sat alongside a piece of industrial equipment. I had to get rid of it, though, because it took up too much space, gave off enough heat that I didn't have to turn on the furnace in winter, and was only 800x600 resolution anyway.

Reply 15 of 35, by d1stortion

Rank: Oldbie
PhaytalError wrote:

The caveat of this "G-Sync" board and "G-Sync" monitors is that it requires a GeForce GTX 650 Ti Boost or higher GPU.

No, I'd rather say the caveat is that you need to get an expensive new monitor, and something tells me most of them will only be 120 Hz TN "gaming" monitors. The video card usually winds up being upgraded after a while anyway. With monitors it's different: if you have some expensive 1920x1200 monitor from 2005 you can just keep using it, since development is really stagnant when it comes to LCD monitors.

Reply 16 of 35, by Gemini000

Rank: l33t
sliderider wrote:

But will it work with AMD video cards? That's the big question. I don't think anyone is going to buy a monitor that locks you in with nVidia forever.

G-Sync is an extra feature, so you could still use regular syncing on a G-Sync-capable monitor. It will still work with AMD cards, just without the benefits G-Sync would provide. :B

However you slice it, syncing monitors to the GPU and not vice versa is a step in the right direction. ;)

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 17 of 35, by eL_PuSHeR

Rank: l33t++

I just hope this isn't more nVidia bullshit. I like freedom of choice. And I hope the thing will work better than their terrible 300.xx series drivers; a lot of people haven't been able to update past 314.22 for seven months now.

Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
8 GB GeForce GTX 1070 G1 Gaming (Gigabyte)

Reply 18 of 35, by d1stortion

Rank: Oldbie

Huh? They are at 331.58 right now, why should you not be able to update?

Also, I still don't think that AMD has better drivers, given that you can't even set maximum settings in some games...

Reply 19 of 35, by eL_PuSHeR

Rank: l33t++

offtopic:

Because many people (especially with Fermi cards) are getting a lot of stability problems when updating beyond 314.22.

And back on topic...

The newer nVidia drivers have an "Adaptive VSync" option. I have to try that. As for the hardware thing (G-Sync), I have seen that it comes with ASUS monitors so far. I really dislike ASUS. In fact, I have an ASUS monitor at work and it's one of the worst I have ever seen. So no ASUS for me, thank you. And my main issue is that this tech is probably proprietary. Unless it gets adopted by VESA or an ISO standards body, it's a no-go for me.

Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
8 GB GeForce GTX 1070 G1 Gaming (Gigabyte)