20 votes

G-Sync/Freesync - What's your opinion?

This was tempting to post in ~games, but I think it suits ~tech better.

What are your thoughts on these monitor frame sync technologies?
Have they made a big difference to your gaming experiences?
Could you do without it?
What about G-Sync vs Freesync?

20 comments

  1. [2]
    teaearlgraycold

    I think variable refresh rate really shines when you're between 40 and 60 FPS. If your GPU can't quite hit a smooth 60 FPS then having a clean 45 FPS is so much better than the temporal aliasing from the "frame, frame, frame, hitch, frame, frame, frame, hitch" cycle on traditional VSync.

    15 votes
    1. timo

Agreed. Did you know that the frametime at 40fps is right in the middle between the frametimes at 30fps and 60fps? You only need 10fps more to get halfway to the smoothness of 60fps :)

Fps   Frametime (ms)   Ratio (vs 30fps)
30    33.33            1.00
40    25.00            0.75
45    22.22            0.67
60    16.67            0.50
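The table above is easy to verify with a couple of lines of Python (the function name is mine, just for illustration):

```python
# Frametime in milliseconds for a given framerate, and the ratio of each
# frametime to the 30 fps baseline, matching the table above.
def frametime_ms(fps):
    return 1000.0 / fps

baseline = frametime_ms(30)  # 33.33 ms

for fps in (30, 40, 45, 60):
    ft = frametime_ms(fps)
    print(f"{fps} fps -> {ft:.2f} ms ({ft / baseline:.2f}x the 30 fps frametime)")
```

Note that 25 ms (40 fps) is exactly the average of 33.33 ms (30 fps) and 16.67 ms (60 fps), which is the point being made.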
      8 votes
  2. [11]
    kaffo

    Around 3 years ago I bought myself a 1440p Freesync Asus monitor to pair with my 3080. I was interested to see if it made any difference after reading some positive reports.
    To my dismay, I was disappointed. Many games actually stuttered more, and in the few that didn't, I couldn't notice any real difference in experience (input lag or tearing).
    This comes from a person who's never really been bothered by tearing, mind you. I turn v-sync off in every game and it's never bothered me, but perhaps I'm missing something!

    I've heard G-Sync is more robust, but it's very expensive and I don't see the reason why the average gamer would splash out for it.

    With that in mind, I'm keen to hear other users' experiences. I'm curious if I just got a trashy monitor. Maybe playing at 1440p is just a mistake when trying to use this tech. Maybe I configured it incorrectly (although I'm pretty sure I did it correctly!)

    4 votes
    1. [3]
      babypuncher
      (edited)

      I've heard G-Sync is more robust, but it's very expensive and I don't see the reason why the average gamer would splash out for it.

      This isn't really true anymore, but it was a problem for a good while.

      You really shouldn't be getting more stutters with either Freesync or G-Sync though, and it could boil down to improper configuration, which is unfortunately not as straightforward as it should be.

      First, for ideal behavior on Nvidia GPUs, you want to set vsync to ON in the Nvidia Control Panel, and turn it OFF in-game for most titles. Second, it's useful to set a framerate cap in NVCP to a few frames below the maximum refresh rate of your panel. (source, along with more setup tips and explanations). Windows also has its own VRR setting you will want to turn on to ensure games that run in "borderless window/fullscreen" modes work properly.
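The "a few frames below max refresh" rule from the linked guide is simple enough to express as a tiny helper (the function name and the 3 fps margin are my own illustration, not an NVCP setting):

```python
# Rule of thumb from the linked setup guide: cap the framerate a few fps
# below the panel's maximum refresh so frames never pile up against the
# VRR ceiling, which would quietly re-introduce vsync lag.
def suggested_fps_cap(max_refresh_hz, margin=3):
    return max_refresh_hz - margin

for hz in (120, 144, 165, 240):
    print(f"{hz} Hz panel -> cap around {suggested_fps_cap(hz)} fps")
```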

      A couple more things to note: if your game has wild variances in frametimes, VRR won't really fix that. If the game likes to bounce around a lot between, say, 60 and 140 FPS, you might get a better and more consistent feeling experience capping it to 90. VRR also won't do anything to smooth over things like traversal or shader compilation stutter. There are also some (fortunately rare) titles that just don't play nice with VRR at all, and often require some extra coaxing to get a good experience.

      7 votes
      1. PleasantlyAverage

        Adding to this, the reason for the frame rate limit is so that you don't experience the v-sync delay. From what I've heard, it can be better to use the in-game frame rate limiter, if one is provided, since the game engine can pace its frames better than an external limiter.
        Also it sounds weird but v-sync and g-sync are complementary technologies. G-sync varies the frame rate of the monitor to only update if a new frame is rendered, and v-sync synchronizes the frames so each one is fully displayed. Without v-sync tearing can still occur.

        Tearing is also something that gets less noticeable the more frames are being rendered, as the difference between each one gets less and less. So if you mostly play games with high framerates then you wouldn't necessarily notice a strong difference. Especially when the framerate exceeds the refresh rate as g-sync is then not active.

        5 votes
      2. sqew

        Thank you for that link to Blur Busters! It's wild to me how much arcane knowledge goes into configuring graphics stuff, so it's helpful to have a rundown article to stash away.

        I guess the complexity makes sense given that it's governed by the zillion interactions between game engine, drivers, hardware, display, etc., etc., but it's still crazy.

        1 vote
    2. [7]
      vord

      In fairness, NVIDIA has always been pushy about using their own proprietary stuff instead of open standards like Freesync; it's one more reason I hate them.

      Here's a good summary of where we stood in 2019. I found that first and edited post after realizing it was somewhat outdated, but they link to one of their more recent updates.

      Nowadays most gaming monitors support variable refresh rates (see the Steam Deck), and NVIDIA has relented to supporting things other than their proprietary stack.

      6 votes
      1. babypuncher

        NVIDIA has always been pushy about using their own proprietary stuff instead of using open standards like Freesync

        It's a double-edged sword. The early days of G-Sync were great, because every G-Sync monitor supported a wide VRR range, variable pixel overdrive, LFC, and other features that are really necessary for a good VRR experience. The early Freesync market on the other hand was flooded with cheap monitors that lacked these features and had very narrow VRR windows that capped out well below the maximum supported fixed refresh rate of the same monitor. I think this disparity is almost entirely because Nvidia was able to exercise strict quality control over monitors that shipped with G-Sync while AMD had no such leverage over monitor makers with Freesync.

        It's all water under the bridge at this point though. The technology has matured enough that G-Sync today is functionally just Nvidia's branding for Freesync and HDMI 2.1's VRR spec, and vendor lock-in is no longer really a thing except on a very small handful of monitors.

        5 votes
      2. [5]
        Chobbes

        The Steam Deck doesn't support VRR. Do you mean that you can just change the refresh rate of the display to something arbitrary?

        1 vote
        1. [4]
          teaearlgraycold

          The Steam Deck does support VRR over DisplayPort out of its USB-C port.

          4 votes
          1. [2]
            Chobbes

            Not on the internal display, though, which is why I was confused about it being brought up in the context of gaming monitors that support VRR.

            6 votes
            1. teaearlgraycold

              Yeah it’s more of a “fun fact” than anything useful.

          2. babypuncher

            Yeah, but I would presume somebody buying a Steam Deck is primarily using its internal display. It's unfortunate that the new OLED model still didn't fix this oversight.

            5 votes
  3. reckoner

    To me it was one of those things that was hard to tell at first but became noticeable once removed. Frame rates over 60 are similar for me. I didn't notice at first, but then taught myself to see it and applied the solution. Practically speaking, is that an improvement? Debatable.

    I can definitely see the difference when looking at the pendulum demo. Maybe try it without sync and see if it bothers you. If not, why bother?
    https://www.nvidia.com/en-us/geforce/community/demos/

    2 votes
  4. babypuncher
    (edited)

    I've been using VRR since some of the earliest G-Sync monitors came out in 2014. I think it is the biggest singular improvement in display technologies for gaming monitors of the last 10 years. OLED gaming monitors are finally a thing and definitely have the potential to be equally game changing, but first gen products still leave a bit to be desired.

    I simply cannot go back to fixed refresh rate displays, to the point where it actually ruins my experience when I run into the rare game that doesn't play nice with VRR (looking at you, Outer Wilds). I've been ecstatic about consoles and consumer TVs finally supporting this in the last few years, as it makes performance modes that don't get a perfect locked 60 FPS feel so much better.

    It's not just about covering performance shortfalls either. It lets me run games at odd framerates when it best suits them. A good example is Doom 3, which is capped at the unusual framerate of 62.5 FPS. The way this game implements that cap makes it have these occasional "jumps" when you play v-synced to 60 Hz, and turning off v-sync on a fixed refresh display introduces some really bad screen tearing. But with VRR, my panel just refreshes at the game's natural 62.5 FPS with no tearing, vsync lag, or stuttering.
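The arithmetic behind those "jumps" is simple (this is my own back-of-the-envelope sketch, not anything from the game):

```python
# A 62.5 fps game cap vsynced to a fixed 60 Hz panel: the game produces
# slightly more frames per second than the panel can show, so a frame
# must be dropped at a regular interval, which reads as a periodic jump.
game_fps = 62.5
panel_hz = 60
surplus_per_second = game_fps - panel_hz    # 2.5 extra frames every second
seconds_per_hitch = 1 / surplus_per_second  # so one visible hitch every 0.4 s
print(f"{surplus_per_second} surplus frames/s, one hitch every {seconds_per_hitch} s")
```

With VRR the panel simply refreshes every 16 ms in lockstep with the game, so there is no surplus to drop.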

    As for the differences between G-Sync and Freesync, they are pretty moot at this point. Freesync used to be a bit of a warning sign and require additional research; displays would ship with a Freesync sticker but often only support a very limited VRR range like 48-60 Hz, even when the monitor supported high fixed refresh rates like 120 Hz. With a G-Sync monitor, you were always guaranteed the full range from 30 Hz up to the maximum supported refresh rate of the panel, with the GPU providing LFC below 30 FPS. The Freesync market has improved considerably as the technology has matured. Especially since Nvidia started certifying many Freesync monitors as "G-Sync compatible", which guarantees these performance characteristics.
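For anyone unfamiliar with LFC, a minimal sketch of how it is usually described (my own simplification; the 48-144 Hz range is just an example):

```python
# Low Framerate Compensation: when the framerate falls below the panel's
# minimum VRR rate, each frame is shown an integer number of times so the
# effective refresh lands back inside the supported range.
def lfc_refresh(fps, vrr_min=48, vrr_max=144):
    refresh = fps
    while refresh < vrr_min:
        refresh *= 2  # repeat each frame one more time
    return min(refresh, vrr_max)

print(lfc_refresh(25))  # 25 fps: panel runs at 50 Hz, each frame shown twice
print(lfc_refresh(40))  # 40 fps: panel runs at 80 Hz
print(lfc_refresh(60))  # already inside the range, stays at 60 Hz
```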

    2 votes
  5. Akir

    One of the reasons why we bought our big expensive TV was because it had fairly good VRR support. And I'll tell you that the difference was astonishing!

    Astonishingly imperceptible.

    I guess it might make a difference if I was playing twitchy competitive games, but I generally don't like those. The games that I play with framepacing problems are issues with the software, not the hardware. The only way to fix them would be patches which are probably not going to come given the games I play.

    I'm getting old now, I guess, but it seems to me that the last two or three decades of major industry-shattering image processing techniques are all barely noticeable. The last one that actually had a big effect on me was probably SSAO. Realtime raytracing was exciting because it's something I was dreaming of since the 90s, but there are so few games that take very good advantage of it, and for the most part they're just adding detail to regular raster graphics. DLSS is a great innovation but all it really does is make games perform better with less expensive cards. It's a good thing for sure, but it's not really a visual improvement per se.

    My gaming rig was fairly powerful when I put it together, but nowadays it's just collecting dust in favor of my Steam Deck. Chasing the edge does not make you happy. Spend your money on games instead.

    1 vote
  6. canekicker

    Some have already mentioned this, but while I've noticed a difference between having VRR on vs off, the most obvious difference is when it's off. Even a game that pushes a higher but unstable refresh rate w/o VRR feels far worse (and when I say feel, I mean makes me queasy) than a game that has VRR on but runs at a lower frame rate on average. To me it's invaluable for gaming.

    On an unrelated side note, for those who have the opportunity to work from home, monitors with VRR tend to be high refresh rate and there is something incredibly satisfying working with Excel on monitors at a high refresh rate.

    1 vote
  7. Thomas-C

    I think for me it's that I'd prefer having either over neither. I think gsync does work out a bit more consistently. But freesync is fine if that's what's available. At least when I'm putting a system together, I'll make sure one or the other is there, but gsync/freesync availability isn't something that drives a lot of my choices.

  8. vczf

    I mostly play Rocket League. G-Sync on makes a big difference in fluidity at high framerates (165Hz capped to 160 FPS). I wouldn't want to go back to playing without VRR.

    https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/

  9. Shogun

    I bought a cheap LG UW Freesync monitor a few years back. Didn't realize it at the time, but Freesync on Nvidia cards only works over DisplayPort. The monitor I got only has HDMI, so I've yet to try things out. Once I upgrade my 2060 I'll get a new monitor.

    On the other hand I hauled my CRT monitor out and hooked it up and the motion clarity on that can be amazing. It'll be interesting to see if gsync/Freesync can match it.