@Xenocyde
@Blackfyre
I've (again) tested RIFE 17.v2, 17.v2 Lite, 18.v2, 19.v2 and 20.v2 with my usual blend of test scenes, in slow motion (x0.25),
looking for the usual things such as distortions, warping of patterns and objects, and artifacts, especially during camera panning etc.
My observations match yours.
1) For me, 18.v2 is by far the best model regarding all of the above.
2) 17.v2 is the second best model, but with a noticeable increase in distortion and warping of patterns and objects, especially when panning.
3) Third best is 19.v2, with the same issues.
4) Fourth best is 17.v2 Lite, with the same artifacts.
5) 20.v2 is the worst of them. Not sure what happened here during training. :-(
Especially when watching in slow motion (x0.25), artifacts, distortions etc. can be observed with all models, but it's surprising how good 18.v2 and 17.v2 Lite look.
For 1080p @ 60 fps I keep using 17.v2 Lite, and for everything up to 3840x1600 / 4K UHD resolution at 48 fps it's 18.v2.
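For reference, x0.25 slow motion is just 4x interpolation played back at the source frame rate. A minimal sketch of the test, assuming a hypothetical `model.inference(img0, img1, timestep=t)` call in the style of the Practical-RIFE wrappers (adapt to whatever your build actually exposes):

```python
import torch

def slowmo_x4(frames, model):
    """frames: list of [1, 3, H, W] tensors in [0, 1]; returns ~4x as many frames."""
    out = []
    for img0, img1 in zip(frames[:-1], frames[1:]):
        out.append(img0)
        # Ask the model for intermediates at t = 0.25, 0.5, 0.75;
        # model.inference is a stand-in for your RIFE build's actual API.
        for t in (0.25, 0.5, 0.75):
            with torch.no_grad():
                out.append(model.inference(img0, img1, timestep=t))
    out.append(frames[-1])
    return out
```

Played back at the original fps, the 4x stream runs at quarter speed, which is what makes the warping and pattern distortions easy to spot.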
@Asking a question out loud
Does anyone have an explanation for why even an NVIDIA RTX 4090 tops out at 48 fps up to 3840x1600 / 4K UHD resolution?
Above 3840x1600 resolution @ 60 fps, playback keeps stuttering no matter what. I've already eliminated other bottlenecks.
Observing the HWiNFO64 metrics for
- GPU utilization
- power draw etc.
alongside Task Manager, there is a strange discrepancy between all of those readings.
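To cross-check those readings outside of HWiNFO64 and Task Manager, here's a minimal logging sketch, assuming the `pynvml` package (NVIDIA's NVML bindings; power is reported in milliwatts):

```python
import time
import pynvml

# Log GPU utilization and power draw once per second while playback runs.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU, e.g. the 4090
try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # percentages
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # mW -> W
        print(f"GPU {util.gpu:3d} %  mem {util.memory:3d} %  {power_w:6.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Keep in mind NVML's `gpu` value only reports the fraction of time at least one kernel was executing, not how busy any single unit (e.g. the Tensor Cores) was, so a 40-60 % reading doesn't rule out one unit being saturated in bursts.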
Up to 3840x1600 resolution @ 48 fps:
- GPU utilization stays around 40-60 %
- power draw hovers around 170-240 W
- Task Manager: Tensor Cores, 3D, Video Encode & Decode and Copy are also far from fully utilized.
Then, getting close to 3840x1600 resolution @ 60 fps and above, a sudden, disproportionate jump happens and stuttering keeps occurring.
The ~25 % increase in frame rate (48 → 60 fps) doesn't warrant a jump of that size ... is the GPU bottlenecked by the Tensor Cores? Not enough of them?
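To put rough numbers on it, here is the interpolated-pixel throughput at each operating point (pure arithmetic, nothing assumed beyond the resolutions and frame rates above):

```python
# Back-of-the-envelope pixel throughput for the operating points above.
points = {
    "3840x1600 @ 48 fps": 3840 * 1600 * 48,
    "3840x1600 @ 60 fps": 3840 * 1600 * 60,
    "3840x2160 @ 60 fps": 3840 * 2160 * 60,  # 4K UHD
}
base = points["3840x1600 @ 48 fps"]
for name, px in points.items():
    print(f"{name}: {px / 1e6:6.1f} Mpx/s ({px / base:.2f}x baseline)")
```

So 48 → 60 fps is only a 1.25x step, and full 4K UHD on top of that is ~1.69x. One hedged guess at the discrepancy: at 60 fps output each interpolated frame must be ready in well under 16.7 ms, so a per-frame inference latency that just misses that budget would show up as stutter without ever pegging the averaged utilization or power readings.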