Would you rather play this game capped at a smooth 30fps, or with variable framerates of 35-45fps?


This is assuming screen tearing is not an issue; rather, you're looking for the smoother experience of the two (more responsive controls, fewer slowdowns and less hitching, etc.).

Also, the second assumption is that you're playing with KB+M, not a controller (and, on that note, with the hardware mouse cursor turned on).

To make it clear, the framerate fluctuates continually. Merely standing still in the forest or swamps can take it from 34 to 38 and back to 35 in under a second. Facing one direction in the city can give 43fps; turning back around takes it down to 34 in an instant. So given this, would you still persist with 35-45 or go for a 30fps lock?
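
To put numbers on that trade-off (my own back-of-envelope sketch, not from the thread): thinking in frame times rather than fps shows what the cap actually buys you.

```python
# Frame times for the two options (illustrative arithmetic only).
# A 30fps cap trades raw speed for a constant frame-to-frame interval;
# uncapped 35-45fps delivers faster frames whose pacing keeps changing.
for fps in (30, 35, 40, 45):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame   (capped: every frame takes the same time)
# 35 fps -> 28.6 ms per frame
# 40 fps -> 25.0 ms per frame   (uncapped: the interval jitters in this band)
# 45 fps -> 22.2 ms per frame
```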
 
Since you're playing on PC...

Only a stable 60fps is acceptable!
If you happen to have a FreeSync/G-Sync monitor, everything above 30 should be fine...
 
If tearing doesn't bother you, then unlocked. Personally I'd go locked, as I can't stand tearing: this beautifully rendered world with tear lines cut through it and offset frames on either side.

As Metaljesus said, G-Sync/FreeSync is the way to go. I have a G-Sync monitor, with framerates from 90+ when sailing the seas down to the 50s in some more intense parts, and it always looks great.
 
If tearing doesn't bother you, then unlocked. Personally I'd go locked, as I can't stand tearing: this beautifully rendered world with tear lines cut through it and offset frames on either side.

As Metaljesus said, G-Sync/FreeSync is the way to go. I have a G-Sync monitor, with framerates from 90+ when sailing the seas down to the 50s in some more intense parts, and it always looks great.

A frame rate lock isn't vsync; it can still tear.
 

I'd go with the 30fps lock. A few years ago I'd have gone with the fluctuating fps; I didn't notice/didn't care about screen tearing back then, but now I can't stand it.
Metaljesus: since he's on a PC, everything is acceptable, because everything is configurable on PC.

But yeah, a G-Sync monitor is also a good way to play games. :)
 
A framerate capped at 30fps never feels smooth, at least not for me, so I wouldn't play this or any other game capped at 30fps.
 
IMO, terms like "smooth 30fps" or "rock-solid 30fps" are just lies people tell themselves to believe they're having a great experience. I always aim for stable 60 fps, no matter how much I have to lower graphics settings.
 
IMO, terms like "smooth 30fps" or "rock-solid 30fps" are just lies people tell themselves to believe they're having a great experience. I always aim for stable 60 fps, no matter how much I have to lower graphics settings.

Hmm, I meant smooth as in "constant". I fully acknowledge 30fps is vastly sub-par compared to 60fps.

The question here is: I cannot reach 60fps, so I have to make a decision between the two choices presented above.
 
Hmm, I meant smooth as in "constant". I fully acknowledge 30fps is vastly sub-par compared to 60fps.

The question here is: I cannot reach 60fps, so I have to make a decision between the two choices presented above.

Isn't there anything else you can do to achieve a higher framerate? Not even lowering the resolution? I'd do that before going down to 30fps.
 
A frame rate lock isn't vsync; it can still tear.

How so? If you mean using the in-game frame limiter then that may be the case, but proper 30Hz vsync will not tear.

Vsync locks the frame rate to the refresh rate of the display. It eliminates tearing by either updating the screen at its full refresh rate or, when the system can't update that fast, at an even division of that rate (1, 1/2, 1/3 of 60Hz, i.e. 60, 30, 20fps). You get the negative of jumping between those fixed rates (stutter), but no tearing.

Enabling adaptive vsync (half refresh rate) in the Nvidia Control Panel is effectively vsync with the system displaying every frame twice, so it's as good at eliminating tearing as 60Hz vsync, but at 30fps. I'd favour this over a variable 30-45fps on a fixed-refresh display for this reason.
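
A minimal sketch of that division behaviour (my own illustration, assuming a 60Hz display and strict double-buffered vsync with no triple buffering):

```python
import math

REFRESH_HZ = 60
refresh_interval = 1.0 / REFRESH_HZ  # one scanout every ~16.7 ms

def displayed_fps(render_fps: float) -> float:
    """Effective fps when every finished frame must wait for the next vblank."""
    render_time = 1.0 / render_fps
    # A frame that misses a vblank waits for the next one, so the display
    # interval rounds UP to a whole number of refreshes: 60, 30, 20, 15...
    refreshes_per_frame = math.ceil(render_time / refresh_interval)
    return REFRESH_HZ / refreshes_per_frame

for fps in (60, 45, 35, 30, 25):
    print(f"rendering at {fps} fps -> displayed at {displayed_fps(fps):.0f} fps")
# rendering at 60 fps -> displayed at 60 fps
# rendering at 45 fps -> displayed at 30 fps
# rendering at 35 fps -> displayed at 30 fps  (31-59 all collapse to 30)
# rendering at 30 fps -> displayed at 30 fps
# rendering at 25 fps -> displayed at 20 fps
```

Which is also why, on a fixed 60Hz display with vsync, the OP's variable 35-45fps would mostly be displayed at 30fps anyway, just with stutter whenever it jumps between steps.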
 
The human eye perceives "stable" fps as smooth, not "high" fps. 30 fps will seem choppy at first, especially when you look at distant objects, which will appear to flicker or stutter across the screen when you pan the mouse around. This will be even more noticeable if you are used to playing games at 60 fps or higher.

However -- the human eye adjusts to this over a short period of time (about as much time as it takes your eyes to adjust to the dark when you switch off the lights). 30 fps is actually quite smooth (most film you've watched in your life was shot at 24 frames per second, and most TV at 25-30). Give your eye enough time to adjust and the "awful" performance suddenly becomes... not too bad, really.

60 fps is much smoother, and you will notice the difference. But if you cannot stay within about a 10 fps range (i.e. 50-60fps), your eye will instantly pick up on the sudden drops in framerate, and the game will appear to "stutter". Therefore, to achieve smooth gameplay, head to Novigrad during a storm and run around with graphics set as high as you want. Figure out what your lowest fps is, then set the frame limit to just below that value. Or lower the graphics settings and try again. This will ensure the game runs at a constant fps in all areas, and it will feel much smoother as you play.

Framerates above 60 become largely inconsequential. The eye loses the ability to distinguish framerates somewhere beyond the 100-120 range, and most people cannot tell the difference between 80 and 120 anyway.

For users with low- to mid-range hardware, try my suggestions above and limit your framerate accordingly. (Be patient after adjusting the game and give your eyes a chance to adapt.) Not only will most people find that the game feels much smoother after about an hour of gameplay -- you'll likely be able to max most of your graphics settings and still maintain a stable fps.
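
A rough sketch of that capping procedure (my own illustration; the fps readings are made up, and in practice you'd take them from an fps overlay such as FRAPS or the Steam counter):

```python
# Pick a frame cap from fps readings taken in the heaviest area you can find
# (e.g. a city during a storm). Cap just below the observed floor so the
# game never visibly drops below its own limit.
samples = [41, 38, 35, 44, 36, 34, 39, 42, 37, 35]  # hypothetical readings

floor_fps = min(samples)
suggested_cap = floor_fps - 1  # small safety margin under the worst reading

print(f"observed floor: {floor_fps} fps -> set the limiter to {suggested_cap} fps")
# observed floor: 34 fps -> set the limiter to 33 fps
```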
 
Set the framerate to unlimited and turn vsync off in the in-game settings, then force vsync on for Witcher 3 through the Nvidia Control Panel (or CCC if using AMD). That will give you the best results for smoothness without screen tearing.

If you turn the in-game frame rate cap on instead, you end up with screen tearing.
 
Set the framerate to unlimited and turn vsync off in the in-game settings, then force vsync on for Witcher 3 through the Nvidia Control Panel (or CCC if using AMD). That will give you the best results for smoothness without screen tearing.

If you turn the in-game frame rate cap on instead, you end up with screen tearing.

People should try this!

Unfortunately, it does not work on a lot of rigs. Like mine. I must run in fullscreen, with in-game vsync on, the in-game frame limit at 60fps, and my monitor refresh set to 60Hz to achieve silky smooth fps everywhere. If I change any of those options or attempt to use Nvidia's vsync -- stutter central. Unplayable.

And I'm running an i7-4790K, a GTX 980 Ti 6GB, and 16GB of DDR4 2133MHz RAM.

There's still some optimization left to do with the graphics engine.
 
People should try this!

Unfortunately, it does not work on a lot of rigs. Like mine. I must run in fullscreen, with in-game vsync on, the in-game frame limit at 60fps, and my monitor refresh set to 60Hz to achieve silky smooth fps everywhere. If I change any of those options or attempt to use Nvidia's vsync -- stutter central. Unplayable.

And I'm running an i7-4790K, a GTX 980 Ti 6GB, and 16GB of DDR4 2133MHz RAM.

There's still some optimization left to do with the graphics engine.

Yeah, I should have mentioned that it's not the same for everyone, but for some setups (particularly multi-GPU setups) it's way better than using the in-game settings.

I had both tearing and stuttering prior to shutting off the in-game frame limit and using vsync in the Nvidia Control Panel instead. Now I get a nice smooth 60fps @ 4K. (Overkill rig, though: dual Titan Xs OC'd to 1500MHz, a 5930K OC'd to 4.6GHz, and 32GB of RAM @ 2666MHz.)
 
30 fps is actually quite smooth (most film you've watched in your life was shot at 24 frames per second, and most TV at 25-30).

Please! It only appears "smooth" because of the motion blur in those films and shows you've seen. Computer graphics are NOT rendered with blur: each frame is a perfectly sharp frame, so the eye and brain are not tricked into perceiving that "smoothness". Seriously, try setting a maximum framerate of 24 in a game and pan the camera around while walking through the world. You will feel extremely uncomfortable, and some people will even feel eye strain. How do movies get away with 24fps? That motion blur.

Motion blur in games only masks poor performance. We are living the experience through a window, and blurring that experience is ludicrous in most situations, which is why a lot of people hate motion blur and depth-of-field blur. Additionally, motion blur reduces your performance in games because it is a taxing effect to render.

All your other points are clear and true. You clearly know what you're talking about, but you had a little trouble with this one part of your post. I hope this helps.
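
To put rough numbers on the blur point (my own arithmetic, with an assumed field of view and pan speed, ignoring perspective distortion):

```python
# How far the image jumps between consecutive sharp frames during a camera
# pan. Big per-frame jumps with no blur between them read as judder.
SCREEN_WIDTH_PX = 1920
HORIZONTAL_FOV_DEG = 90       # assumed field of view
PAN_SPEED_DEG_PER_S = 120     # a brisk mouse pan

px_per_degree = SCREEN_WIDTH_PX / HORIZONTAL_FOV_DEG

for fps in (24, 30, 60):
    jump_px = PAN_SPEED_DEG_PER_S / fps * px_per_degree
    print(f"{fps} fps: the image hops ~{jump_px:.0f} px per frame")
# 24 fps: the image hops ~107 px per frame
# 30 fps: the image hops ~85 px per frame
# 60 fps: the image hops ~43 px per frame
```

A film frame hides those hops under motion blur; a sharp game frame shows every one of them.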
 
Please! It only appears "smooth" because of the motion blur in those films and shows you've seen. Computer graphics are NOT rendered with blur: each frame is a perfectly sharp frame, so the eye and brain are not tricked into perceiving that "smoothness". Seriously, try setting a maximum framerate of 24 in a game and pan the camera around while walking through the world. You will feel extremely uncomfortable, and some people will even feel eye strain. How do movies get away with 24fps? That motion blur.

Motion blur in games only masks poor performance. We are living the experience through a window, and blurring that experience is ludicrous in most situations, which is why a lot of people hate motion blur and depth-of-field blur. Additionally, motion blur reduces your performance in games because it is a taxing effect to render.

All your other points are clear and true. You clearly know what you're talking about, but you had a little trouble with this one part of your post. I hope this helps.

Then explain why 30fps gameplay videos on YouTube look decent enough?
 
IMO, terms like "smooth 30fps" or "rock-solid 30fps" are just lies people tell themselves to believe they're having a great experience.

This comes across as ignorant. Perhaps people simply have no issue with 30 fps?

I'll take 30 fps/max graphics over 60 fps/lower graphics any day.
 
Please! It only appears "smooth" because of the motion blur in those films and shows you've seen. Computer graphics are NOT rendered with blur: each frame is a perfectly sharp frame, so the eye and brain are not tricked into perceiving that "smoothness". Seriously, try setting a maximum framerate of 24 in a game and pan the camera around while walking through the world. You will feel extremely uncomfortable, and some people will even feel eye strain. How do movies get away with 24fps? That motion blur.

Motion blur in games only masks poor performance. We are living the experience through a window, and blurring that experience is ludicrous in most situations, which is why a lot of people hate motion blur and depth-of-field blur. Additionally, motion blur reduces your performance in games because it is a taxing effect to render.

All your other points are clear and true. You clearly know what you're talking about, but you had a little trouble with this one part of your post. I hope this helps.

There's no trouble with the post, and you've correctly identified why older games seem so very smooth: blur. This is why console games began purposefully adding rendering effects like "motion blur" or "ghosting" all the way back in the N64 days. Especially on a CRT monitor or vacuum tube TV, it looks great.

Blur on film, however, is a double-edged sword. It does mask the "stutter" of sharp camera movements, but it also limits how quickly you can move the camera while still capturing a clear image. In truth, camera operators go to great lengths to ensure that there is no blur in their moving shots. You're watching a steady 24-30 fps. Whenever a shot requires a lot of motion, as with action shots, camera operators often up their speed to 32 fps (which also mucks with your colors and requires an annoying post process, but I digress). It's the lack of "hard edges" and the natural DoF on film that allow the eye to view playback as perfectly smooth.

24 fps on a monitor, using frame-by-frame rendering, would look choppy, but 30 is quite nice. Add a bit of blur, and it's really quite smooth. Even 24 fps is not fatiguing on the eyes as long as the blur is high enough. Again, you have to let the eye adjust. Our brain has this funny little thing it does concerning any visual input it receives -- it adapts. It's only when things drop below 20 fps that the eye can clearly pick up on each frame in turn, and the motion seems interrupted.

Modern audiences struggle a bit because their eyes have become accustomed to 60+ fps. It will take about 10 minutes for the eye to view 30 fps as smooth (even without motion blur); it will take about an hour for the brain to forget what 60 fps was like. 30 will never look as smooth as 60, but it will no longer seem choppy.

Actually, if you play at 30 fps for long, looking at 60 fps again will be what seems unnatural: as if the images are "sliding around on the screen" instead of moving naturally.
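
For concreteness on the film side (my own back-of-envelope numbers, assuming the common 180-degree shutter; not something from this thread):

```python
# Exposure time per film frame under a 180-degree shutter (open for half of
# each frame interval), and how far an object smears during one exposure if
# it crosses the full screen width in 2 seconds.
def exposure_s(fps: float, shutter_deg: float = 180.0) -> float:
    """Seconds the shutter stays open per frame."""
    return (shutter_deg / 360.0) / fps

SCREEN_CROSS_TIME_S = 2.0  # assumed object speed: full screen in 2 s

for fps in (24, 30, 60):
    exp = exposure_s(fps)
    smear_pct = exp / SCREEN_CROSS_TIME_S * 100
    print(f"{fps} fps: exposure 1/{round(1 / exp)} s, smear ~{smear_pct:.1f}% of screen width")
# 24 fps: exposure 1/48 s, smear ~1.0% of screen width
# 30 fps: exposure 1/60 s, smear ~0.8% of screen width
# 60 fps: exposure 1/120 s, smear ~0.4% of screen width
```

That built-in smear is the "soft edges" doing the smoothing; a game frame rendered without it has none.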
 
OK, assuming you play at 30fps but with no blur (god, I hate blur, and it seems quite exaggerated in this game), is it "bad"? Or still OK?

What Sigil Fey says at the top of Page 2 is very true... given enough time, I can actually play just fine at 30fps. All it takes is for the eyes and brain to adjust a bit to the lower framerate.

Of course, 60fps is always superior to 30fps, but I think in TW3 at least, 30fps isn't really that bad. Maybe in a fast-paced shooter like Wolfenstein: The New Order 30fps would suck and hamper your experience, but here it's still playable enough.

Now, a further question: will your eyes go "bad" if you play games at 30fps (without blur) as opposed to 60fps? "Bad" as in the prescription of your glasses getting stronger. Is there a relation between framerate and eye degradation? What about resolution? Is 1080p30 considered acceptable enough that it doesn't adversely impact your eyesight relative to 1080p60?

Thanks!
 