Huge smoothness increase (involving Vsync)

Discussion in 'Player Support' started by BlackDove, Jan 28, 2014.

  1. BlackDove

    I discovered that I've been doing something wrong for a long time.

    I've had adaptive Vsync on with triple buffering in my Nvidia control panel, as it should be. It's impossible for me to play anything without Vsync on.

    However, I also had it selected in a lot of games, thinking it made no difference whether it was enabled in the games as well, as long as it was in the drivers.

    It does make a huge difference though!

    I disabled Vsync in a bunch of games that had the option selected, and left it on in the Nvidia control panel (adaptive, triple buffered), and the difference is incredible.

    I'm not sure if this affects all cards (I have a 660Ti), but it's at least worth a try. PS2 still hiccups and lags, but this makes the times when it's not doing that much smoother!

    It also makes every other game that I had it enabled in a million times smoother.

    Basically, enable Vsync (with triple buffering, and adaptive if available) in your drivers, and disable it in your games. Having it on at the driver level and off in the applications makes it infinitely smoother (at least for my hardware configuration).
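    To make the idea concrete, here's a rough Python sketch of the per-frame choice adaptive Vsync makes (the names and timings are illustrative, not actual driver internals): at or above the refresh rate it waits for the vertical blank like normal Vsync; below it, it presents immediately and accepts a tear instead of stalling.

```python
# Rough model of adaptive Vsync's per-frame choice.
# Names and numbers are illustrative, not real driver internals.

REFRESH_HZ = 60
VBLANK_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between refreshes

def present_mode(render_time_ms):
    """Decide how to present a frame that took render_time_ms to draw."""
    if render_time_ms <= VBLANK_INTERVAL_MS:
        # Keeping pace with the display: sync to the next vertical
        # blank, so there is no tearing.
        return "wait_for_vblank"
    # Falling behind: present right away and accept a tear, rather
    # than holding the old frame for a whole extra refresh (judder).
    return "present_immediately"

for t in (10.0, 16.0, 20.0, 33.0):
    print(f"{t:5.1f} ms frame -> {present_mode(t)}")
```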

    Let me know your results!
  2. sicsoo

    I have no problem with PS2's in-game Vsync. I had an AMD card before, a 6870; with active Vsync the game needed something like D3DOverrider to enable triple buffering. With an Nvidia card it doesn't; the game itself activates triple buffering and is fine (I play mostly big battles at 40fps, Vsync enabled, and no hiccups).

    I don't find any difference between Vsync via the control panel (real Vsync, not adaptive; adaptive works differently and isn't useful at all against tearing) and PS2's in-game Vsync.
  3. BlackDove

    If you want to see what I'm talking about, enable adaptive Vsync in the drivers, then enable and disable Vsync in the game, run around and fly something, and see if you can tell a difference. I can personally notice a huge difference. This doesn't fix the places where PS2 itself glitches and just hiccups, but it fixes the smoothness of the motion when it's not hiccupping.

    This isn't just PS2 either. I had Vsync enabled in the settings of most of my games, as well as in the drivers, thinking it wouldn't do anything to have both enabled. When I disabled it in game (but left it on in the drivers), it made motion look much smoother.

    Personally, I prefer the occasional tearing from adaptive Vsync (which basically turns Vsync off when the frame rate drops below 60fps) to the judder/stutter you get from any Vsync setup, triple or double buffered, when the frame rate drops below the refresh rate and a frame has to be rescanned. http://www.pcper.com/news/Graphics-Cards/PCPer-Live-NVIDIA-G-Sync-Discussion-Tom-Petersen-QA There are a lot of good explanations, along with slow-motion capture of the judder I'm talking about, at around 30 minutes into that video.
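    The judder I'm describing can be shown with a little arithmetic. With double-buffered Vsync at a fixed refresh rate, a frame stays on screen for a whole number of refresh intervals, so a frame that misses the vblank by even a millisecond gets held for two full scanouts (60Hz numbers assumed; this is my own sketch, not from the video):

```python
import math

REFRESH_MS = 1000.0 / 60  # one 60Hz scanout, ~16.67 ms

def display_time_ms(render_time_ms):
    """With double-buffered Vsync, a frame is shown for a whole number
    of refresh intervals: ceil(render_time / interval) scanouts."""
    return math.ceil(render_time_ms / REFRESH_MS) * REFRESH_MS

# A renderer at a steady 55fps (18.2 ms/frame) misses every vblank,
# so each frame is held for two refreshes: 33.3 ms instead of 16.7.
for render in (15.0, 18.2, 35.0):
    print(f"render {render:5.1f} ms -> on screen {display_time_ms(render):5.1f} ms")
```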

    Ideally, we'd all be using G-Sync with low persistence OLED monitors, but until then I'll use adaptive Vsync, since it gives much more natural motion than double or triple buffering (although the tearing when it drops below 60fps sucks).

    Like I said, I'm not sure whether all GPUs are affected, but 600 and 700 series GPUs are very common on this forum. Anyone who has adaptive Vsync on in the drivers, to get perfect Vsync at the refresh rate without judder or stutter below it, will definitely want to try this.
  4. JonboyX

    Another tip is that running in Windowed mode costs you about 10fps. I tried that once to monitor Mumble on the desktop, and quickly switched it back to full screen.
  5. sicsoo

    I don't understand the point of enabling triple buffering via the Nvidia control panel. It only works for OpenGL applications; I don't know if they've changed it recently, but so far it does nothing for other games.
    The only thing I know is that adaptive Vsync gives tearing. Yes, it "maybe" looks smoother and free of choppiness, but you get tearing, and if you're not okay with sliced frames, you need to enable normal Vsync. And there's no need to enable triple buffering, since the game controls pre-rendered frames itself (there was a post by a developer about this).
  6. BlackDove

    I should have been more clear about the triple buffering thing.

    Personally, switching from the in-game Vsync to only using the driver-level Vsync made a night-and-day difference for me.

    I personally don't mind a couple of horizontal tears when the choice is:

    A: a couple of horizontal tears

    B: the whole image judders because the GPU rescans the same frame, thereby doubling its effective frametime.

    In a game with as many hiccups as this one, any improvement in smoothness is welcome.
  7. vulcan78

    I'm gonna try this again. I agree, having a few horizontal tears is infinitely preferable to the massive stuttering induced by conventional Vsync. I hope SOE gets around to addressing the lack of optimization for capable rigs, i.e., SLI and Intel multi-core CPUs.
  8. acksbox

    Yes, the NVIDIA setting is only for OpenGL apps, and though there are OpenGL games, PlanetSide 2 is not one of them (it's DX9).

    Yes, adaptive vsync will still tear any time frame rate is less than refresh rate, though how much is noticeable will depend on the individual and the situation.

    Planetside 2 does indeed allow Vsync without locking fps to a whole-number divisor of the refresh rate, rather like triple buffering.

    It's almost impossible to avoid stutter with any kind of vsync with a fixed refresh rate, unless you can always maintain a higher frame rate than your refresh rate, or you have such a high refresh rate display that some frames lasting 2-3x as long as others is not perceptible.
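    That 2-3x variation can be put in numbers. With classic double-buffered Vsync and a perfectly steady render time, the sustained frame rate snaps down to a whole-number divisor of the refresh rate (a simplified sketch of the arithmetic, assuming 60Hz and no triple buffering):

```python
import math

REFRESH_HZ = 60

def locked_fps(render_fps):
    """Steady-state fps under double-buffered Vsync: each frame occupies
    n = ceil(refresh / fps) refresh intervals, so the displayed rate is
    refresh / n (60 -> 30 -> 20 -> 15 ...)."""
    if render_fps >= REFRESH_HZ:
        return float(REFRESH_HZ)
    n = math.ceil(REFRESH_HZ / render_fps)
    return REFRESH_HZ / n

for fps in (75, 59, 45, 25):
    print(f"rendering at {fps:2d} fps -> displayed at {locked_fps(fps):.1f} fps")
```

    Note that dropping from 61fps to 59fps halves the displayed rate, which is exactly why missing the refresh rate by a hair feels so bad.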

    For this reason, I generally leave vsync off, unless the titles are quite old.
  9. BlackDove

    With a modern midrange GPU, you can easily maintain more than 60fps in most games at 1920x1080.
  10. sicsoo

    Well, maybe you wanna try this, since it seems to work for me: if your monitor supports 50Hz at 1080p, use 50Hz instead of 60Hz via the Nvidia control panel, then activate normal Vsync.

    try it.
  11. nitram1000

    For AMD users like myself: download RadeonPro and enable Dynamic Vsync Control, which is the same as Nvidia's Adaptive Vsync. It really is incredibly smooth, and it helps reduce GPU temps by capping the frame rate at 120fps, the same as your monitor's refresh rate.
  12. vulcan78

    If you don't have a 120 Hz monitor see if that program will let you select 60 FPS, expect even cooler temperatures.
  13. vulcan78

    I am actually going to try this. For others also interested, I did a little research and found that you can switch to 50Hz via the Nvidia Control Panel; I assume the process is similar with AMD.

    http://forum.notebookreview.com/sager-clevo/660283-you-guys-know-you-can-overclock-lcd.html
  14. BlackDove

    Why would you want your monitor to refresh 10Hz slower and use normal Vsync?
  15. sicsoo

    I don't know, man; it must be something with ms and response times from the game/Vsync/pre-render buffer, etc., but I'm not a technical guy. All I know is that I get much better smoothness with 50Hz Vsync than 60fps with some tearing.
    I play with everything on ultra, and many situations give me around 50fps or even less (40-45 with shadows on ultra). So, if your fps is always around 60, adaptive is a must (also for input lag), but if you drop easily, avoiding tearing is best, and I like normal Vsync more.

    Giving it a try; I didn't know about this. My monitor's max is actually 74Hz, so what I'm trying is locking Vsync via the Nvidia panel to 1/2 refresh rate. That gets me 37fps in-game, really smooth, but there is a bit of input lag compared to 50Hz Vsync.
    Gonna play with it a bit; there is no stutter, since it seems locked perfectly by the Nvidia driver.
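    The 37fps figure is just the half-refresh arithmetic; a quick sketch of the whole-divisor rates for a 74Hz panel:

```python
REFRESH_HZ = 74

# Half-refresh Vsync holds each frame for two scanouts, so the frame
# rate is refresh / 2; other whole divisors follow the same rule.
for divisor in (1, 2, 3):
    print(f"1/{divisor} refresh -> {REFRESH_HZ / divisor:.2f} fps")
# 1/2 refresh at 74Hz is the 37 fps seen in-game.
```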
  16. YuriSensen


    So can I with a low-end card, if I find the right one. It's less about the tier of the card and more about the actual card. You can buy a $300 card and find out it's total crap compared to something around $200.
  17. acksbox

    Depends on the game and what settings you are going for.

    I have an R9 290X in my primary system and a GTX 780 in another. Neither will guarantee 60fps all the time in quite a few modern games (or Planetside 2) with maximum details at 1080p.
  18. vulcan78

    I tried setting the refresh rate to 50Hz via the Nvidia Control Panel and didn't see any improvement; it was less smooth during moments where 60fps is usually possible (Warpgate, outposts on Indar, etc.).

    It may work for others, but I saw no improvement. I should note that when the game says I am GPU bottlenecked, I am seldom at over 45% utilization per GPU; this is a flagrant, serious optimization issue.
  19. BlackDove

    Yuri Sensen:

    https://forums.station.sony.com/ps2...e-can-i-run-it-upgrade-advice-threads.170564/

    You make it sound like it's a guessing game, but it really isn't. Check my guide, specifically the GPU section. You need to know what to look for when you're buying a GPU: FLOPS and memory bandwidth are more important than the model number. All the information about which chips are good and which ones suck is easy to find with a few clicks.

    acksbox:

    You might want to check your drivers and settings. If you regularly drop below 60fps in anything other than Metro Last Light, or something with a lot of supersampling, while GPU limited, you should assess why. Being CPU limited and dropping below 60fps is easy enough in PS2. What CPUs do you have?

    The 780 is a good GPU. The 290 has a lot of thermal issues, throttles pretty badly, and has inconsistencies, but you should still be able to get a fairly constant 60fps with it. The 780 and 290 are really designed for 2560x1600 at 60Hz or 1920x1080 at 120/144Hz.

    Vulcan and sicsoo:

    There is really no reason to play at a 50Hz refresh rate, as a lot of people will detect the individual refreshes when they're that low.

    The reason you're seeing apparently smoother motion is probably that you're using in-game Vsync rather than the driver-level Vsync.

    The whole point of this thread was to increase smoothness in PS2, and I know this doesn't work with all GPUs.

    However, if you have a desktop GPU (laptops with Optimus don't seem to work with this) that supports Adaptive Vsync, try turning it ON in the drivers and OFF in every game where the driver-level Vsync works (PS2 does).

    In the case of PS2, it made the motion infinitely smoother than the in-game Vsync does!
  20. vulcan78

    My GPUs (680M SLI) support adaptive Vsync, which I am using and agree is a little smoother than conventional, application-level Vsync. I hope SOE gets their act together and addresses optimization for SLI (and X-Fire) and Intel multi-core CPUs. Right now the game runs like garbage.