It all comes down to one basic question - have you ever seen truly constant 60fps content?
I'm accustomed to watching stable 60fps in video games.
A movie played by a media player at 60fps that delivers real 60fps in only a couple of scenes is no good for me.
A 30fps movie bumped 2x by SVP to 60fps IS NOT A 60FPS MOVIE.
The movie file is recorded at 30fps, but the actual perceived frame rate takes a serious hit whenever:
a) the camera moves,
b) many objects move in front of the camera,
c) there is some quick change, like an explosion or a building crumbling into millions of pieces in a moment,
d) the camera turns while filming (panning horizontally or tilting vertically),
e) (my favorite) all of those things together.
example 1 - you get a 60fps movie,
but if there is a scene in the movie that effectively has only 5fps of real motion, you'll get 2 x 5 = 10fps.
10fps is not a movie - 10fps is the Total Recall balcony scene. I watched it in IMAX and was shocked at how bad it looked.
The image gets choppy or blurry, not silky smooth as we deserve.
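To put example 1 in numbers, here's a rough sketch of the arithmetic as I understand it (the scene names and their "real" frame rates are made-up illustrations, not measurements):

```python
# 2x interpolation just doubles whatever real frame rate a scene has -
# it can't invent smooth motion that was never captured.
def interpolated_fps(scene_fps, multiplier=2):
    return scene_fps * multiplier

# a movie "at 30fps" whose scenes carry different amounts of real motion
scenes = {"static dialogue": 30, "camera pan": 15, "explosion": 5}
for name, fps in scenes.items():
    print(f"{name}: {fps}fps -> {interpolated_fps(fps)}fps after 2x SVP")
```

The explosion scene only reaches 2 x 5 = 10fps - still choppy, whatever the file's label says.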
edit: example 2 - what happens with a movie interpolated to 120fps with my settings on a 60Hz monitor?
(I'm no expert - that's the best explanation I've come up with)
1. The media player wants to display a 30fps movie.
2. SVP captures the video and adds interpolated frames.
3. The movie is set to display at 120fps.
4. Scenes that were already fluid at 60fps get cut down to 60fps (30fps => 120fps => 60fps) - no gain here, everything was perfect already.
5. Scenes that were bad to begin with get bumped up and then displayed with no loss, e.g. (5fps => 20fps => 20fps) - a huge gain where it matters.
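The steps above boil down to: interpolate high, then let the monitor cap the result. A small sketch of how I picture it (the 30fps source, 120fps target and 60Hz cap are my assumed settings, not anything SVP reports):

```python
def displayed_fps(real_scene_fps, source_fps=30, target_fps=120, monitor_hz=60):
    """Effective frame rate a scene ends up at when the whole file is
    interpolated from source_fps to target_fps and the monitor then
    caps the output at its refresh rate."""
    multiplier = target_fps / source_fps        # e.g. 30 -> 120 is 4x
    interpolated = real_scene_fps * multiplier  # SVP adds the frames
    return min(interpolated, monitor_hz)        # the monitor cuts the excess

print(displayed_fps(30))  # already-fluid scene: 30 -> 120 -> capped at 60, no gain
print(displayed_fps(5))   # bad scene: 5 -> 20 -> fits under 60Hz, huge gain
```

So overshooting the refresh rate costs nothing on the good scenes and rescues the bad ones.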