Advanced HTPC users often adopt specialized renderers such as madVR, which deliver better video rendering quality. Unlike the standard EVR-CP (Enhanced Video Renderer - Custom Presenter), which doesn't stress the GPU much, renderers like madVR are very GPU-intensive. This has often been the sole reason for HTPC users to opt for NVIDIA or AMD cards in their builds. Traditionally, Intel GPUs have lacked the performance necessary for madVR to function properly, particularly with high-definition streams. We ran some experiments to check whether Ivy Bridge manages to improve the situation.

Using our testbed with 4 GB of DRAM running at DDR3-1333 9-9-9-24, we took one clip each of 1080i60 H.264, 1080i60 VC-1, 1080i60 MPEG-2, 576i50 H.264, 480i60 MPEG-2 and 1080p60 H.264, and tabulated the CPU and GPU usage for various combinations of decoders and renderers. It is quite obvious that madVR drives up CPU usage compared to pure DXVA mode (with the EVR-CP renderer), because the CPU has to copy the decoded frames back to system memory before madVR can run its algorithms on the GPU. A single star against a GPU usage figure indicates 5 - 10 dropped frames over the 3 minute duration; double stars indicate that the number of dropped frames was high enough for the drops to be clearly visible to the naked eye.
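
For clarity, the star annotations in the tables below can be summarized as a small sketch (purely illustrative bookkeeping, not part of any actual test harness; the thresholds are the ones described above):

```python
def annotate_gpu_usage(gpu_percent: int, dropped_frames: int) -> str:
    """Append the star notation used in the tables to a GPU usage figure.

    dropped_frames is the count over the ~3 minute clip, as reported by the
    renderer's statistics overlay.
    """
    if dropped_frames > 10:
        return f"{gpu_percent}**"   # heavy dropping, clearly visible to the naked eye
    if dropped_frames >= 5:
        return f"{gpu_percent}*"    # 5 - 10 drops over the 3 minute duration
    return str(gpu_percent)         # effectively clean playback

# Example: 86% GPU load with 40 dropped frames is reported as "86**"
print(annotate_gpu_usage(86, 40))
```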

DDR3-1333 [9-9-9-24] : CPU / GPU usage in percent

                  | madVR 0.82.5                                              | EVR-CP 1.6.1.4235
                  | QuickSync Decoder | DXVA2 Copy-Back | DXVA2 (SW Fallback) | DXVA2   | QuickSync Decoder
480i60 MPEG-2     | 3 / 74            | 3 / 74          | 4 / 74              | 5 / 28  | 5 / 28
576i50 H.264      | 3 / 59            | 3 / 58          | 4 / 58              | 5 / 25  | 5 / 27
1080i60 H.264     | 14 / 86**         | 11 / 86**       | 14 / 81*            | 6 / 42  | 8 / 48
1080i60 VC-1      | 13 / 84**         | 13 / 80*        | 13 / 80*            | 13 / 47 | 8 / 47
1080i60 MPEG-2    | 12 / 82**         | 12 / 80**       | 9 / 78**            | 5 / 44  | 9 / 48
1080p60 H.264     | 18 / 97*          | 20 / 97**       | 18 / 96**           | 5 / 44  | 12 / 50

With DDR3-1333, it is evident that 1080i60 streams just can't make it through madVR without becoming unwatchable; memory bandwidth constraints are clearly a problem. So we decided to overclock the memory a bit, and got the G.Skill ECO RAM running at DDR3-1600 without relaxing the timings. Of course, we made sure the system was stable under Prime95 for a couple of hours before proceeding with the testing. With the new memory configuration, GPU usage dropped considerably, and we were able to get madVR to render even 1080p60 videos without dropping frames.
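
The bandwidth side of this is easy to put in perspective. Assuming the usual dual-channel configuration with a 64-bit bus per channel, the theoretical peak figures work out as in the back-of-the-envelope sketch below (theoretical numbers, not measured ones):

```python
def peak_bandwidth_gb_s(transfer_rate_mt_s: float, channels: int = 2) -> float:
    """Theoretical peak DDR3 bandwidth: 8 bytes per transfer per 64-bit channel."""
    return transfer_rate_mt_s * 8 * channels / 1000.0

for speed in (1333, 1600, 1800):
    print(f"DDR3-{speed}: {peak_bandwidth_gb_s(speed):.1f} GB/s")

# DDR3-1333: 21.3 GB/s
# DDR3-1600: 25.6 GB/s  (roughly 20% more headroom for the copy-back and rendering steps)
# DDR3-1800: 28.8 GB/s
```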

DDR3-1600 [9-9-9-24] : CPU / GPU usage in percent

                  | madVR 0.82.5                                              | EVR-CP 1.6.1.4235
                  | QuickSync Decoder | DXVA2 Copy-Back | DXVA2 (SW Fallback) | DXVA2   | QuickSync Decoder
480i60 MPEG-2     | 2 / 76            | 2 / 76          | 2 / 73              | 5 / 27  | 5 / 27
576i50 H.264      | 2 / 57            | 2 / 57          | 3 / 57              | 5 / 25  | 5 / 24
1080i60 H.264     | 7 / 77            | 11 / 74         | 12 / 74             | 6 / 40  | 9 / 40
1080i60 VC-1      | 7 / 76            | 11 / 75         | 12 / 79             | 12 / 40 | 8 / 40
1080i60 MPEG-2    | 6 / 74            | 6 / 74*         | 8 / 75*             | 5 / 39  | 9 / 40
1080p60 H.264     | 13 / 82           | 14 / 84         | 14 / 80             | 6 / 41  | 10 / 42

However, the 5 - 10 dropped frames in the 1080i60 MPEG-2 clip continued to bother me. I tried to overclock G.Skill's DDR3-1600 rated DRAM, but was unable to reach DDR3-1800 without sacrificing latency. With a working configuration of DDR3-1800 12-12-12-32, I repeated the tests, but found that the figures didn't improve.

DDR3-1800 [12-12-12-32] : CPU / GPU usage in percent

                  | madVR 0.82.5                                              | EVR-CP 1.6.1.4235
                  | QuickSync Decoder | DXVA2 Copy-Back | DXVA2 (SW Fallback) | DXVA2   | QuickSync Decoder
480i60 MPEG-2     | 2 / 75            | 2 / 75          | 2 / 72              | 5 / 27  | 5 / 27
576i50 H.264      | 2 / 57            | 2 / 57          | 3 / 57              | 5 / 25  | 5 / 24
1080i60 H.264     | 7 / 74            | 11 / 73         | 12 / 74             | 6 / 39  | 9 / 40
1080i60 VC-1      | 7 / 74            | 11 / 74         | 12 / 77             | 12 / 39 | 8 / 40
1080i60 MPEG-2    | 6 / 74            | 6 / 74*         | 8 / 74*             | 5 / 39  | 9 / 40
1080p60 H.264     | 12 / 84           | 14 / 84         | 14 / 80             | 6 / 41  | 10 / 42

My inference is that low memory latency is as important as high bandwidth for madVR to function effectively. I am positive that with a judicious choice of DRAM, it is possible to get madVR working flawlessly on the Ivy Bridge platform. Of course, more testing needs to be done with other algorithms, but the outlook is quite positive.
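
A rough calculation illustrates the point: converting the CAS latency from clock cycles into nanoseconds (the memory clock being half the DDR transfer rate), the DDR3-1800 CL12 configuration ends up no quicker in absolute terms than DDR3-1333 CL9, while DDR3-1600 CL9 is the quickest of the three configurations we tried:

```python
def cas_latency_ns(transfer_rate_mt_s: float, cl: int) -> float:
    """Absolute CAS latency: CL cycles at the memory clock (half the DDR transfer rate)."""
    memory_clock_mhz = transfer_rate_mt_s / 2.0
    return cl / memory_clock_mhz * 1000.0

for speed, cl in ((1333, 9), (1600, 9), (1800, 12)):
    print(f"DDR3-{speed} CL{cl}: {cas_latency_ns(speed, cl):.2f} ns")

# DDR3-1333 CL9 : 13.50 ns
# DDR3-1600 CL9 : 11.25 ns  <- lowest absolute latency of the three
# DDR3-1800 CL12: 13.33 ns  <- more bandwidth, but the latency advantage is gone
```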

In all this talk about madVR, let us not forget the efficient QuickSync / native DXVA2 decoders in combination with EVR-CP. With low CPU usage and moderate GPU usage, these combinations deliver satisfactory results for the general HTPC crowd.

Comments

  • Exodite - Tuesday, April 24, 2012 - link

    Anyone that says they can tell any difference between a 65% and 95% color gamut is a whiny bitch.

    See, I can play that game too!

    Even if I were to buy your "factual" argument, and I don't, I've clearly stated that I care nothing about the things you consider advantages.

    I sit facing the center of my display, brightness and gamma are turned down to minimum levels, and saturation is low. Measured power draw at the socket is 9W.

    It's a 2 ms TN panel, obviously.

    All I want is more vertical space at a reasonable price, though a 120Hz display would be nice as well.

    My friend is running a 5ms 1080p eIPS display and between that and what I have I'd still pick my current display.

    End of the day it's personal preference, which I made abundantly clear in my first post.

    Though it seems displays, and IPS panels in general, are starting to attract the same amount of douchiness as the audiophile community.
  • Old_Fogie_Late_Bloomer - Tuesday, April 24, 2012 - link

    Oh, I know I shouldn't--REALLY shouldn't--get involved in this. But you would have to be monochromatically colorblind in order to not see the difference between 65% and 95% color gamut.

    I'm not saying that the 95% gamut is better for everyone; in fact, unless the 95% monitor has a decent sRGB setting, the 65% monitor is probably better for most people. But to suggest that you have to be a hyper-sensitive "whiny b---h" to tell the difference between the two is to take an indefensible position.
  • Exodite - Tuesday, April 24, 2012 - link

    Yeah, you shouldn't have gotten into this.

    Point being that whatever the difference is I bet you the same can be said about latency.

    Besides, as I've said from the start it's about the things that you personally appreciate.

    My preferred settings absolutely destroy any kind of color fidelity anyway, and that doesn't even slightly matter as I don't work with professional imagery.

    But I can most definitely appreciate the difference between TN and even eIPS when it comes to gaming. And I consider the former superior.

    I don't /mind/ higher color fidelity or better viewing angles, I'm just sure as hell not going to pay any extra for it.
  • Old_Fogie_Late_Bloomer - Wednesday, April 25, 2012 - link

    I agree completely that, as you say, "it's about the things you personally appreciate." If you have color settings you like that work on a TN monitor that you can stand to deal with for long periods of time without eye strain, I would never tell you that you should not use them because they don't conform to some arbitrary standard. Everybody's eyes and brain wiring are different, and there are plenty of reasons why people use computers that don't involve color accuracy.

    But as it happens, you picked a poor counterexample, because I defy you to put a Dell U2412M (~68% of aRGB) next to a U2410 set to aRGB mode (somewhere close to 100% of aRGB) and tell me you can't see a difference.

    For that matter, I challenge you to find me someone who literally can't see the difference between the two in terms of color reproduction. That person will have something seriously wrong with their color vision.
  • Exodite - Wednesday, April 25, 2012 - link

    To be fair, the counterexample wasn't about being correct, because the poster I replied to wasn't either, but rather about showing what an asshat argument he was making.

    That said it's about the frame of reference.

    Would you be able to tell the difference working with RAW images pulled from your DSLR or other high-quality imagery?

    Sure, side-by-side I have no doubt you would.

    Would you be able to tell the difference when viewing the desktop, a simple web form, or an editor where the only colors are black, white, two shades of blue and grey?

    Especially once both displays are calibrated to the point I'm comfortable with them. (Cold hue, 0% brightness, low saturation, negative gamma, high contrast.)

    I dare say not.
  • DarkUltra - Monday, April 30, 2012 - link

    I'd like to see a "blind" test on this. Is there a perceived difference between 6 ms and 2 ms? Blind as in the test subjects (nyahahaa) don't know which response time they're looking at.

    Test with both a 60 Hz and a 120 Hz display. I would guess the moving object, an explorer window for instance, would simply be easier to look at and look less blurred as it moves across the screen. People used to fast-paced gaming on CRT monitors or "3D ready" 120 Hz monitors would see more of a difference.
  • Origin32 - Saturday, April 28, 2012 - link

    I really don't see any need for improvement in video resolution just yet. I have nearly perfect eyesight and can be extremely annoyed by artifacts, blocky compression, etc., but I find 720p detailed enough even for action movies that rely solely on special effects. In most movies 1080p appears too sharp to me; add to that the fact that most movies are already oversharpened and post-processed, plus the increased bitrate (and therefore file size) of 1080p, and I see more downside than upside to it.
    This all goes double for 4K video.

    That being said, I do still want 4K badly for gaming, viewing pictures, reading text; there are tons of things it'll be useful for.
    But not for film, not for me.
  • Old_Fogie_Late_Bloomer - Monday, April 23, 2012 - link

    Another advantage of a 4K screen (one that has at least 2160 vertical resolution) is that you could have alternating-line passive 3D at full 1080p resolution for each eye. I'm not an expert on how this all works, but it seems to me that the circular polarization layer is a sort of afterthought for the LCD manufacturing process, which is why vertical viewing angles are narrow (there's a gap between the pixels and the 3D polarizing layer).

    In my opinion, it would be pretty awesome if that layer were integrated into the panel in such a way that vertical viewing angles weren't an issue, and so that any monitor is basically a 3D monitor (especially high-quality IPS displays). But I don't really know how practical that is.
  • peterfares - Thursday, September 27, 2012 - link

    A 2560x1600 monitor (available for years) has 1.975 times as many pixels as a 1920x1080 screen.

    4K would be even better, though!
  • nathanddrews - Monday, April 23, 2012 - link

    4K is a very big deal for a couple reasons: pixel density and film transparency.

    From the perspective of pixel density, I happily point to the ASUS Transformer 1080p, iPad 3, and any 2560-pixel-wide 27" or 30" monitor. Once you go dense, you never go... back... Anyway, as great as 1080p is, as great as Blu-ray is, it could be so much better! I project 1080p at about 120" in my dedicated home theater - it looks great - but I will upgrade to 4K without hesitation.

    Which leads me to the concept of film transparency. While many modern movies are natively shot in 4K using RED or similar digital cameras, the majority are still shot on good ol' 35mm film. 4K is considered by most professionals and enthusiasts to be the baseline for an excellent transfer of a 35mm source to the digital space - some argue 6K-8K is ideal. Factor in 65mm, 70mm, and IMAX, and you want to scan the original negative at 8K or higher to capture all the fine detail (as far as I know, no one is professionally scanning above 8K yet).

    Of course, recording on a RED at 4K or scanning 35mm at 4K or 8K is a pointless venture if video filtering like noise reduction or edge enhancement is applied during the mastering or encoding process. Like smearing poop on a diamond.

    You can't bring up "normal" people when discussing the bleeding edge. The argument is moot. Those folks don't jump on board for any new technology until it hits the Walmart Black Friday ad.
