Re: A Better/Faster/Smoother Solution to MPC+Madvr & PotPlayer
OK, it appears I was still running on the integrated Intel HD card even though I had configured it... now, with the Radeon HD 7670M, I'm still getting a black screen.
To unleash the potential of MPDN (I'm seeing way more detail in film than I've ever seen before), you have to use a render script. MPDN's author posts two easy options here:
http://forum.doom9.org/showpost.php?p=1 … tcount=322
The options are:
(1) Pre-/Post processing with NEDI
(2) Pre-/Post processing with SuperResChroma + SuperRes with NEDI
The first should be doable by a mid-range video card, whereas the second nearly maxes out my GTX 970. Both, for me, were great improvements over the upscaling I achieved using madVR with other players.
Other than that, I find the dithering function too noisy so I shut it off.
Glad you've fiddled around with it and managed to push SuperChromaRes + SuperRes with NEDI on your GTX 970... I could never pull that off with my 7970 OC. Question: at what FPS are you running your videos? And if you happen to run MSI Afterburner (not telling you to run it, just if you already do), can you let me know what your GPU usage is?
Do you upscale 1080p to higher resolutions, or are you running on a 1080p TV/monitor and not doing any upscaling?
Edit: I'm asking because I'm thinking of getting a GTX 970 or 980. The Titan X is way out of my budget; I might wait for AMD to release the 300 series.
Hi, I'm running SVP at 60fps as that's the max I can get out of my 60Hz 1080p 37" LG TV. (The screen is usually about 1 meter from my face, so artifacts, halos, blurring, noise, etc. are readily detectable.)
Don't know if it matters, but I can't get the TV to work properly via HDMI cable with video cards (though it works fine via HDMI with tablets), so I'm doing everything through an RGB cable.
Installed Afterburner out of curiosity. Didn't touch any settings.
For 1080p material (MPDN): with an average rendering time of 9 ms, GPU usage was at 50%; with a rendering time of 15 ms, it was at 82%.
The same material with madVR: at 9 ms rendering time, GPU usage was 77%; at a 16 ms average rendering time, it was 90%.
For 720p material (MPDN): with an average rendering time of 12.5 ms, the GPU was at 70%.
With madVR, 83%.
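To put those rendering times in perspective, here's a quick back-of-the-envelope check in Python, using the timings quoted above (the 60Hz frame budget is just 1000/60 ms): anything much above ~16.7 ms per frame means dropped frames at 60fps.

```python
# Rough headroom check: frame budget at a given refresh rate vs. the
# average rendering times reported in the post above.
refresh_hz = 60
frame_budget_ms = 1000 / refresh_hz  # ~16.7 ms per frame at 60 Hz

render_times_ms = {
    "MPDN 1080p (light scene)": 9.0,
    "MPDN 1080p (heavy scene)": 15.0,
    "madVR 1080p (heavy scene)": 16.0,
    "MPDN 720p": 12.5,
}

for label, t in render_times_ms.items():
    headroom = frame_budget_ms - t
    print(f"{label}: {t:.1f} ms used, {headroom:.1f} ms headroom "
          f"({t / frame_budget_ms:.0%} of the frame budget)")
```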
BTW, I got madVR to work better in the interim, so the difference between MPDN and madVR is not quite so striking. Still, MPDN scores noticeably better on 1080p and sometimes markedly better when upscaling 720p. But madVR produces a reliably clean picture, whereas in some cases MPDN can be noisy. Having said that, for me MPDN is the clear winner when it comes to picture quality.
Edit: I'm asking because I'm thinking of getting a GTX 970 or 980. The Titan X is way out of my budget; I might wait for AMD to release the 300 series.
Go with the 980, and I'm saying that based on my personal experience with AMD: I found they have a tendency to run too hot when pushed too hard. That may be fine for an AMD card, but all that heat transfers into the case, where other components have to deal with it.
The 980 has 2048 cores as opposed to the 970's 1664 cores - more guts to run things.
I must disagree with this for a few reasons...
1. AMD does not have any more of a tendency to run hot than Nvidia does. AMD's 3000, 4000, 5000, and 6000 series GPUs all ran cooler and were less power-hungry than Nvidia's flagship GPUs of the time (the 9000 series up through the 500 series); note that I'm excluding dual-GPU models for obvious reasons.
2. Temperature and heat output are not the same thing: more heat output can give higher temperatures, but a higher temperature doesn't necessarily mean higher heat output. This matters because the current Radeon 200 series has quite a high transistor density, which gives higher temperatures but doesn't mean higher heat output; the same is true of Ivy Bridge and Haswell CPUs (see the rough sketch after this list).
3. The Radeon 200 series reference air cooler was crappy. Models with the stock cooler are still available, but you'd be an idiot to buy them: there are literally only 2 models of, for example, the 290 on Newegg compared to something like 70 models with non-reference coolers, and those 2 reference models aren't the cheapest either. Nevertheless, most reviews used the stock cooler since the non-reference models weren't available at launch, so the temperatures in those reviews aren't going to be representative unless the review specifically used a card with an aftermarket cooler.
4. Radeon GPUs have a user-selectable maximum temperature setting in the drivers that is set to 95°C by default for whatever reason; users who don't want them running that hot merely have to turn the setting down.
5. It would be silly to buy any GPU right now - we know the 300 series is coming this quarter. Even if you buy an Nvidia GPU, you would still be able to take advantage of the corresponding price drops.
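Here's the rough sketch promised in point 2, as a toy calculation (the wattages and die areas below are made-up round numbers, not real specs): the same heat output spread over a smaller, denser die gives a higher power density, and therefore higher temperatures, even though the total heat dumped into the case is identical.

```python
# Toy illustration of point 2: identical heat output, different die area.
# The numbers are made up for illustration - they are not real GPU specs.
def power_density(watts: float, die_area_mm2: float) -> float:
    """Heat output per unit of die area, in W/mm^2."""
    return watts / die_area_mm2

big_die = power_density(250, 550)    # 250 W spread over a ~550 mm^2 die
dense_die = power_density(250, 440)  # same 250 W over a smaller, denser die

print(f"big die:   {big_die:.2f} W/mm^2")
print(f"dense die: {dense_die:.2f} W/mm^2 (runs hotter at the same heat output)")
```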
DISCLAIMER: My current PC uses an Intel integrated GPU and my (currently unplugged) discrete GPU is an old-ish Geforce 8800GS.
More info for Blackfyre to make an informed decision.
I was messing around with an application called DmitriRender, which runs solely on the GPU.
At the time I had an MSI R9 290; I was testing PowerDVD 14's Fluid Motion Video and didn't like it. I tried the card with DmitriRender and the temps were reaching around 85 degrees, so I sent it back and got a 980, whose fans barely come on at 65 degrees.
http://www.ncix.com/detail/msi-radeon-r … -92891.htm
http://www.ncix.com/detail/evga-geforce … 102571.htm (the one I have now)
This is just personal preference, but I never did like AMD's control panel compared to NVIDIA's; I was never a fan of AMD because of that.
Did you have a model with the reference air cooler? They look like this:
http://cdn5.thinkcomputers.org/wp-conte … -290-1.jpg
FYI, AMD's been using basically the same stock cooler since the 5000 series, so it's no wonder it's as crappy as it is - it'd be like using a reference Nvidia non-Titan cooler on the Titan X. That's very likely why there are literally only 2 SKUs available that use said cooler.
Just posted some links for you.
That reference cooler is lousy - there was a time when I was even planning to water cool, and I have the bits to water cool the CPU to run SVP overclocked, but I'm still waiting for my "Be Quiet" fans from NCIX.
You guys are really crazy
"Lets buy GTX980 for the some unknown "SuperResChroma", I've no idea what is that but is sounds great!'
http://wiki.mikejung.biz/MadVR_Chroma_U … ge_Quality
I don't disagree that the GeForce 900 series is indeed less power-hungry than the Radeon 200 series, but to AMD's credit, their current GPUs are a year older than Nvidia's.
I originally bought mine to run DmitriRender, which is extremely power-hungry on the GPU.
I don't use it anymore though - I discovered too late that it had too many annoying artifacts.
I don't regret spending the money on it though...
And your link Chainik, I can't see any differences in those screen shots at all - I don't know what I'm looking for, but it must be quite subtle.
Well, when did you buy it? Personally, if I bought one now only for it to cost 40% to 50% less in 3 months, I'd be pretty annoyed with myself.
Then again, I'm someone who used a lowly AMD E-350 for 5 months after my AM2+ mobo died, until I got the $90 (after rebate) G3258 + mobo bundle from MicroCenter - and that was largely because my friend was giving me some DDR3 RAM for free (one module was bad, but it had a lifetime warranty; he just didn't want to hassle with an RMA).
Jeff R 1 wrote: I don't regret spending the money on it though...
so you were just waiting (searching?) for a "good enough" reason to buy it
He He He !!!
Maybe, but in the past I've always wasted money buying low-end components, only to have to upgrade later on.
So I just jumped in for the whole ball of wax.
I can run DmitriRender at the same time as SVP - I can use the settings in SVP to control the artifact problem in DmitriRender, although I'm just using SVP right now.
low end components
Well, for one thing, what's your definition of 'low end'? Some people would consider my Pentium G3258 to be 'low end', even with my current work-in-progress overclock of 4.1 GHz.
I have a GTX 760 that was throttling at 85 degrees with DmitriRender.
I added an aftermarket water cooler, which kept the chip cool, but the MOSFETs were boiling hot.
I made up some custom cooling fins and put a fan directly on them to keep the temps under control.
It was a Zotac card with an aftermarket cooling fan on it.
Money well wasted there.
Before that there was a GTX 610 that wouldn't run a damn thing - I was able to return that one.
Well, for one thing, Kepler isn't all that great at GPU compute... heck, at times it's worse than Fermi.
That wouldn't by chance be the reason you got that 290, would it? Because it certainly does have good GPU-compute performance.
Yup
I agree with you there; the whole card and water cooler is stowed away in a box...
I got the 290 solely to run the AMD Fluid Motion thing with CyberLink's PowerDVD 14 app - a waste of time and money.
Jeff R 1 wrote: And your link Chainik, I can't see any differences in those screen shots at all - I don't know what I'm looking for, but it must be quite subtle.
You have to look at it blindfolded to sense the difference with your third eye, young Jedi.
Ha Ha !!! Young Jedi... I just turned 50 !
I'll have to look at it on a better screen, I'm just on a 720p 15 inch laptop screen right now.
I have a 1080p HP monitor and of course my Panasonic 4000 projector - maybe then.
One thing I did notice is the black levels: they're raised a bit with madVR - I especially noticed this with the projector when I pick madVR as "Output" in MPC-HC.
And I don't like that, it looks a bit washed out.
Hi, with all these settings for the new player and the filters, I noticed better frame-interpolation quality. Is there a way to create a script for MeGUI to render these settings to a new file with x264.exe?
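For what it's worth, the encoding half of that idea might look roughly like the sketch below. This is only a guess at the workflow, not a tested MeGUI setup: the script name, output file, and encoder settings are assumptions, and it presumes an x264 build with AviSynth input support.

```python
# Hypothetical sketch only: feed an AviSynth script that already applies
# your filter chain to x264.exe and write the result to a new file.
# "interpolated.avs", the output name, and the encoder settings are all
# assumptions - adjust them to whatever your setup actually uses.
import subprocess

avs_script = "interpolated.avs"          # AviSynth script with the filter chain
output_file = "output_interpolated.mkv"  # encoded result

subprocess.run(
    [
        "x264.exe",
        "--preset", "slow",
        "--crf", "18",
        "--output", output_file,
        avs_script,
    ],
    check=True,  # raise an error if x264 exits abnormally
)
```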
Jeff R 1 wrote: And your link Chainik, I can't see any differences in those screen shots at all
exactly
There is no real benefit to using NNEDI3 for Chroma Upscaling since I can barely notice a difference in quality, yet NNEDI3 is about 5 times more expensive than just sticking with the madvr default, bicubic.
Well, technically the likes of Lanczos and bicubic aren't really all that great as upscaling algorithms...
Hi, with all these settings for the new player and the filters, I noticed better frame-interpolation quality. Is there a way to create a script for MeGUI to render these settings to a new file with x264.exe?
First we'd need an explanation as to where the increased quality is coming from! And some kind of proof to show that it's not placebo.
MPDN works with script chains... is it using AviSynth behind the scenes, or is it handling video in a completely different way? The way it handles script chains does seem similar to AviSynth.
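I don't know MPDN's internals, but conceptually a script chain is just a sequence of filters applied in order, which is the same mental model as an AviSynth script. A purely illustrative Python sketch of the idea (none of these names are MPDN's real API):

```python
# Purely conceptual sketch of a "script chain": each stage takes a frame
# and returns a processed frame, and the chain applies them in order.
from typing import Callable, List

Frame = list  # stand-in for real frame data
Stage = Callable[[Frame], Frame]

def run_chain(frame: Frame, chain: List[Stage]) -> Frame:
    for stage in chain:
        frame = stage(frame)
    return frame

# Dummy stages standing in for e.g. a chroma-upscaling pass and a
# sharpening/super-resolution pass.
def upscale_chroma(frame: Frame) -> Frame:
    return frame + ["chroma upscaled"]

def super_res(frame: Frame) -> Frame:
    return frame + ["super-res pass"]

print(run_chain(["source frame"], [upscale_chroma, super_res]))
```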
Someone would need to screen-capture playback with MPC-HC and MPDN so that the playback can be properly analyzed. Camtasia Studio does a good job of recording video at up to 30fps; I don't know if there are free alternatives that allow proper screen recording.
CamStudio comes to mind.