Topic: Default setting = highest setting?

I'm using an i5-2500K @ 4.5 GHz with an ATI HD 4850.
All settings in the 1080p profile are set to "default" except "Motion vectors precision", which is set to "Half pixel" because of the low frame rate.
CPU usage exceeds 95% if everything is set to "default"  sad
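For reference, here is a minimal sketch of the kind of AviSynth script SVP builds from these settings - I'm assuming the SVPflow functions (SVSuper / SVAnalyse / SVSmoothFps) and the pel key from the plugin docs, so the exact strings SVP writes for the 1080p profile may differ:

# Minimal sketch of an SVPflow chain - plugin paths and the dummy
# source are placeholders; in real playback ffdshow feeds the video in.
LoadPlugin("svpflow1.dll")
LoadPlugin("svpflow2.dll")

BlankClip(length=300, width=1920, height=1080, fps=24, pixel_type="YV12")

# "Motion vectors precision: Half pixel" should map to pel:2
# ("One pixel" would be pel:1); everything else left at its default.
super   = SVSuper("{pel:2}")
vectors = SVAnalyse(super, "{}")

# Double the frame rate with default interpolation settings.
SVSmoothFps(super, vectors, "{rate:{num:2,den:1}}")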

I would like to ask how I can improve the performance (maybe through software settings?).
And how is the performance if I buy a recent graphics card that supports OpenCL?
Thank you so much!  smile

Post's attachments: external filter.png (14.73 kb, 601 x 281), internal filter.png (28.87 kb, 589 x 469), Output.png (33.78 kb, 598 x 481)

Re: Default setting = highest setting?

raymondkwan
And how is the performance if I buy a recent graphics card that supports OpenCL?

1.5-2 times better according to the SVPmark database,
plus more smoothness and better picture quality.

Re: Default setting = highest setting?

THX! So the "default" settings are the best possible quality for today's high-end hardware?  cool
Some settings like "Frames interpolation mode", "SVP shader" and "Decrease grid step" have higher-quality(?) options that can be chosen.
What would the hardware requirements be if those "ultra high" settings were set too?  roll

Re: Default setting = highest setting?

... How can I set the video decoder to use QuickSync (QS) while I have both the ATI HD 4850 and the Intel HD Graphics 3000 enabled in my PC?
The monitor is connected to the HD 4850's DVI output.
http://forum.doom9.org/showthread.php?t=163110 - QS is a huge improvement!?

Re: Default setting = highest setting?

raymondkwan
What would the hardware requirements be if those "ultra high" settings were set too?

6+ cores with HT @ 5 GHz?  big_smile

How can I set the video decoder to use QuickSync (QS)?

in LAV decoder properties


Re: Default setting = highest setting?

WOW, how about a 2-way (is that even possible?) or bigger setup with a Xeon E5-4650 (8C/16T) or Xeon E7-8870 (10C/20T)?  cool

Chainik
How can I set the video decoder to use QuickSync (QS)?
in LAV decoder properties

I tested a short AVC-encoded 1080p video and found that the CPU usage with QuickSync is nearly the same (around 70%) as with DXVA2 copy-back, using the same SVP settings ("default" with one pixel, bilinear interpolation mode)...
no reduction in CPU usage  sad

If the default settings aren't the highest settings, then what quality level does "default" correspond to? "High" in the SVPmark "real-life" test?
That way I can know the performance level  neutral

Re: Default setting = highest setting?

I dunno what you expected, but it's not a big surprise that two HW decoders have similar CPU consumption.

"default" settings are the best BUT:
1. I personally think 21th "SVP shader" is better than 13th (though this didn't affect performance in GPU mode)
2. Some believe "Decrease grid step" helps and in that case "by two with global refinement" is almost an ultimate value

So it's more like the "highest" test in SVPmark  wink

But you should know exactly where to look to see any difference between the "Decrease grid step" modes.
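To make #1 and #2 concrete, here is a rough sketch of where they land in the SVPflow parameter strings - algo in SVSmoothFps selects the shader, and I'm guessing "Decrease grid step" turns into a refine entry in SVAnalyse (the thsad number is only a placeholder, not what the SVP GUI actually writes):

# Same chain as in the first post, with the two suggestions applied.
super   = SVSuper("{pel:2}")

# Assumed mapping of "Decrease grid step: by two with global refinement"
# to one refine pass that halves the motion grid step.
vectors = SVAnalyse(super, "{refine:[{thsad:200}]}")

# algo:21 selects the 21st "SVP shader" instead of the 13th.
SVSmoothFps(super, vectors, "{rate:{num:2,den:1},algo:21}")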