I suggest you go to the cinema and watch any movie at 24 fps, then come back to your computer and check again whether fps is what makes the difference.
The difference between 40 and 50 fps is none, except a statistically driven orgasmic feeling.
Except a 24 fps movie will be 24 fps whether you view it in the cinema or at home on a computer screen; it doesn't change when you view it on a 60 Hz display.
They're also MOVIES, not games. The way framerates are handled in each is very different.
Movies are captured from real life using a variety of cameras that gather light through a lens and save it as frames, either on film or, in more modern cameras, to digital media.
This means movies get natural motion blur: each frame blends motion together, because the lens exposure time integrates all the light that arrives while the shutter is open.
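To put a number on that: a rough sketch of per-frame exposure time, assuming the common 180-degree shutter convention from filmmaking (exposure lasts half the frame interval) — the convention is my assumption here, not something stated above:

```python
def exposure_time(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Seconds of light integration per frame.

    A 180-degree shutter means the film is exposed for half of
    each frame interval, so motion during that window is smeared
    into the frame as natural blur.
    """
    frame_interval = 1.0 / fps
    return frame_interval * (shutter_angle_deg / 360.0)

# At 24 fps with a 180-degree shutter, each frame integrates
# 1/48 s (~20.8 ms) of real-world motion.
print(exposure_time(24))
```

So even at only 24 frames per second, every frame carries roughly 21 ms of accumulated motion, which is why film looks smooth despite the low framerate.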
Video games have to render the frames instead.
They don't get the benefit of this motion blur. Instead they have to rely on post-process effects to fake it, which still looks terrible, because the blur has to fit within a render budget of 16 ms (at 60 fps) or 33 ms (at 30 fps) per frame, on top of everything else being rendered. So all we got was terrible-looking motion blur. We're only now starting to get decent techniques like per-object motion blur, which more closely mimics what cameras do, thanks to the new consoles finally being above toaster tier.
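Those 16 ms / 33 ms figures are just the frame interval at the target framerate — a quick sketch of where they come from:

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to render one frame at a target fps.

    Everything — geometry, lighting, and post-process effects like
    motion blur — has to fit inside this window, every frame.
    """
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

Note how the budget shrinks fast: a 60 fps target leaves about half the time of a 30 fps target for the exact same amount of work, which is why cheap screen-space blur was the norm for so long.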
Please don't tell me it's "a placebo" or a "statistically driven orgasmic feeling", because it's bullshit.
Or that there's no difference between 40, 50, or 60 fps, because there is.
I don't need a framerate counter in the corner of my screen to see when a game isn't running at 60 fps but at 50 or 40, and neither do most people who know what they're talking about.
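Part of what makes the drop so visible is the cadence on a fixed-refresh display, not just the raw framerate. A small sketch (my own illustration, assuming each rendered frame is held until the next refresh, i.e. vsync-style presentation): at 50 fps on a 60 Hz display, frames can't map evenly onto refreshes, so some frames are shown once and some twice:

```python
REFRESH_HZ = 60   # fixed display refresh rate
RENDER_FPS = 50   # game's actual render rate

def ceil_div(a: int, b: int) -> int:
    """Integer ceiling division (avoids float rounding issues)."""
    return -(-a // b)

# For each of the first 10 rendered frames, count how many 60 Hz
# refreshes display it before the next frame is ready.
counts = []
for i in range(10):
    first = ceil_div(i * REFRESH_HZ, RENDER_FPS)       # first refresh showing frame i
    last = ceil_div((i + 1) * REFRESH_HZ, RENDER_FPS)  # first refresh showing frame i+1
    counts.append(last - first)

print(counts)  # [2, 1, 1, 1, 1, 2, 1, 1, 1, 1] — uneven judder pattern
```

That repeating 2-1-1-1-1 pattern is judder: every fifth frame lingers twice as long as its neighbors, and that uneven rhythm is exactly what variable-refresh tech is built to eliminate.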
Stuff like G-Sync and FreeSync is being made for a reason, and not to achieve some statistical orgasm.