Before proceeding to the business end of the review, let us take a look at some power consumption numbers. The G.Skill ECO RAM was set to DDR3-1600 during the measurements, and we measured the average power drawn at the wall under different conditions. In the table below, the Blu-ray movie was played from the optical disc using CyberLink PowerDVD 12, and the Prime95 + Furmark combination was run for an hour before any measurements were taken. The MKVs were played back from a NAS attached to the network; the testbed itself was connected to a GbE switch (as was the NAS). In all cases, a wireless keyboard and mouse were connected to the testbed.

Ivy Bridge HTPC Power Consumption (average power at the wall)
Idle: 37.7 W
Prime95 + Furmark (full load): 127.1 W
Blu-ray playback from the optical drive: 57.6 W
1080p24 MKV playback (MPC-HC + QuickSync + EVR-CP): 47.1 W
1080p24 MKV playback (MPC-HC + QuickSync + madVR): 49.8 W
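
To put these wall-power figures in perspective, the sketch below converts them into a rough annual energy estimate. The usage pattern and electricity price are illustrative assumptions, not measurements from the review.

```python
# Rough annual energy estimate from the measured wall-power figures above.
# The duty cycle (hours/day) and electricity price are illustrative
# assumptions, not numbers from the review.

IDLE_W = 37.7        # measured idle power (W)
PLAYBACK_W = 47.1    # measured 1080p24 MKV playback, QuickSync + EVR-CP (W)

HOURS_PLAYBACK_PER_DAY = 4   # assumed
HOURS_IDLE_PER_DAY = 4       # assumed (machine off the rest of the day)
PRICE_PER_KWH = 0.12         # assumed electricity price (USD/kWh)

daily_wh = PLAYBACK_W * HOURS_PLAYBACK_PER_DAY + IDLE_W * HOURS_IDLE_PER_DAY
annual_kwh = daily_wh * 365 / 1000
annual_cost = annual_kwh * PRICE_PER_KWH

print(f"~{annual_kwh:.0f} kWh/year, ~${annual_cost:.2f}/year at the assumed rates")
# -> roughly 124 kWh/year, about $15/year under these assumptions
```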

The Ivy Bridge platform ticks all the checkboxes for the average HTPC user. Setting up MPC-HC with LAV Filters was a walk in the park. With good and stable support for the DXVA2 APIs in the drivers, even software like XBMC can take advantage of the GPU's capabilities. The QuickSync and DXVA decoders are equally efficient, and essential video processing steps such as cadence detection and deinterlacing work beautifully.
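
For readers who want to confirm that GPU-accelerated decode is actually engaging on their own clips, a quick command-line comparison is one option. The sketch below is not part of the review's MPC-HC + LAV Filters setup; it simply times an ffmpeg software decode against a DXVA2-assisted decode of the same file (the file name is a placeholder, and a Windows ffmpeg build with dxva2 support is assumed).

```python
# Quick sanity check: time a software decode vs. a DXVA2-assisted decode
# of the same clip with ffmpeg. "sample.mkv" is a placeholder path.
import subprocess
import time

CLIP = "sample.mkv"  # placeholder, not a file from the review

def decode_time(extra_args):
    """Decode CLIP to a null sink with ffmpeg and return elapsed seconds."""
    cmd = ["ffmpeg", "-v", "error", *extra_args, "-i", CLIP, "-f", "null", "-"]
    start = time.time()
    subprocess.run(cmd, check=True)
    return time.time() - start

sw_time = decode_time([])                      # plain software decode on the CPU
hw_time = decode_time(["-hwaccel", "dxva2"])   # GPU-assisted decode via DXVA2

print(f"software decode: {sw_time:.1f} s, dxva2 decode: {hw_time:.1f} s")
# A large gap in elapsed time / CPU usage suggests the hardware path is active.
```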

For advanced users, the GPU is capable of supporting madVR for most usage scenarios even with slow memory in the system. With fast, low-latency DRAM, it is even possible that madVR can be used as a renderer for the most complicated streams. More investigation needs to be carried out to check the GPU's performance under different madVR algorithms, but the initial results appear very promising.

Does this signify the end of the road for the discrete HTPC GPU? Unfortunately, that is not the case. The Ivy Bridge platform is indeed a HTPC dream come true, but it is not future-proof. While Intel will end up pleasing a large HTPC audience with Ivy Bridge, there are still a number of areas which Intel seems to have overlooked:

  • Despite the rising popularity of 10-bit H.264 encodes, the GPU doesn't seem to support decoding them in hardware. That said, software decoding of 1080p 10-bit H.264 is not complex enough to overwhelm the i7-3770K (though that may not be true for lower-end CPUs). A quick way to identify 10-bit and 4K clips is sketched after this list.
  • The video industry is pushing 4K and it makes more sense to a lot of people compared to the 3D push. 4K will see a much faster rate of adoption compared to 3D, but Ivy Bridge seems to have missed the boat here. AMD's Southern Islands as well as NVIDIA's Kepler GPUs support 4K output over HDMI, but none of the current motherboards for Ivy Bridge CPUs support 4K over HDMI.
  • It is not clear whether the Ivy Bridge GPU supports decode of 4K H.264 clips. With the current drivers and LAV Filter implementation, 4K clips were decoded in software mode. This could easily be fixed through a driver / software update. In any case, without the ability to drive a 4K display, the capability would be of limited use.
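
As a practical aside to the 10-bit and 4K bullets above, ffprobe makes it easy to tell whether a given clip falls into either category before assuming hardware decode will kick in. This is only a sketch: the file name is a placeholder, and an ffprobe binary on the PATH is assumed.

```python
# Identify clips the Ivy Bridge GPU may not decode in hardware:
# 10-bit H.264 (Hi10P) or 4K-resolution streams.
# "clip.mkv" is a placeholder; requires ffprobe on the PATH.
import json
import subprocess

CLIP = "clip.mkv"  # placeholder

out = subprocess.check_output([
    "ffprobe", "-v", "error", "-select_streams", "v:0",
    "-show_entries", "stream=codec_name,pix_fmt,width,height",
    "-of", "json", CLIP,
])
stream = json.loads(out)["streams"][0]

is_10bit_h264 = (stream["codec_name"] == "h264"
                 and "10" in stream.get("pix_fmt", ""))   # e.g. yuv420p10le
is_4k = stream["width"] >= 3840 or stream["height"] >= 2160

print(f"{CLIP}: codec={stream['codec_name']} pix_fmt={stream.get('pix_fmt')} "
      f"{stream['width']}x{stream['height']}")
print("10-bit H.264:", is_10bit_h264, "| 4K-class resolution:", is_4k)
```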

Discrete HTPC GPUs are necessary only if one has plans to upgrade to 4K in the near term. Otherwise, the Ivy Bridge platform has everything that a HTPC user would ever need.

Comments

  • MGSsancho - Monday, April 23, 2012 - link

    While I agree with most everything, there is something I would like to nitpick: when making a digital copy of old film, in whatever format you use, more often than not a lot of touching up needs to be done. The Wizard of Oz and all the 007 films are examples. (I am ignoring the remastering of Star Wars and Lucas deciding to add in 'features' vs. giving us a cleaned-up remaster sans bonuses.) Still, when you're spending millions on a remaster, I expect them to at least not muddy the entire thing up.

    However, I feel we need to bring in higher bitrates first. I will not apologize for this: yes, encoders are great, but a 4 Mbps 1080p stream is still not as nice as a 20-60 Mbps VBR Blu-ray film. The fear is that a craptastic 4K or even 2K bitrate will ruin the experience for the uninformed. Also, note that I am ignoring an entirely different debate over whether the current infrastructure can handle true HD streaming to every household, at least in the US.
  • nathanddrews - Monday, April 23, 2012 - link

    Higher bit rates will be inherent with 4K or 2K over 1080p, but bit rates aren't the be-all and end-all. 4K will likely use HEVC (H.265), which offers double the compression with better quality than H.264.

    Fixing scratches, tears, or other issues with film elements should never be a reason for mass application of filtering.
  • SlyNine - Tuesday, April 24, 2012 - link

    H.264 doesn't even offer 2x the compression over MPEG-2. I doubt H.265 offers 2x over H.264.

    "This means that the HEVC codec can achieve the same quality as H.264 with a bitrate saving of around 39-44%."

    Source: http://www.vcodex.com/h265.html
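
To put the quoted figure in concrete terms: a 39-44% bitrate saving works out to roughly a 1.6-1.8x compression gain rather than 2x. The quick calculation below uses 20 Mbps purely as an illustrative Blu-ray-class bitrate.

```python
# What a 39-44% bitrate saving (per the quoted vcodex figure) means in practice,
# using a 20 Mbps H.264 stream as an illustrative bitrate.
h264_mbps = 20.0

for saving in (0.39, 0.44):
    hevc_mbps = h264_mbps * (1 - saving)
    ratio = h264_mbps / hevc_mbps
    print(f"{saving:.0%} saving -> {hevc_mbps:.1f} Mbps, "
          f"i.e. ~{ratio:.2f}x compression vs. H.264")
# 39% saving -> 12.2 Mbps (~1.64x); 44% saving -> 11.2 Mbps (~1.79x)
# A true 2x gain would require a 50% saving.
```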
  • Casper42 - Monday, April 23, 2012 - link

    I LOL'd at "Walmart Black Friday" Nathan :)

    And for the OP, 32", really?
    It's completely understandable that you don't see the difference on a screen that size.
    Step up to a 60" screen and then go compare 720p to 1080p. (Who uses 1080i anymore? Oh, that's right, crappy 32" LCDs. Don't get me wrong, I own 2, but they go in the bedroom and my office, not my Family Room.)

    I think 60" +/- 5" is pretty much the norm nowadays for the average middle-class family's main movie-watching TV.
  • anirudhs - Monday, April 23, 2012 - link

    Cable TV maxes out at 1080i (I have Time Warner). My TV can do 1080p.
  • nathanddrews - Monday, April 23, 2012 - link

    1080i @ 60 fields per second, when deinterlaced, carries the same picture information as 1080p @ 30 frames per second. The picture quality is almost entirely dependent upon your display's ability to deinterlace. However, cable TV is generally of a lower bit rate than OTA or satellite.
  • SlyNine - Tuesday, April 24, 2012 - link

    Yeah, but because of shimmering effects, progressive images almost always look better.

    If the video has 2:2 or 3:2 pulldown, many TVs can now rebuild the fields into a progressive image.
  • Exodite - Tuesday, April 24, 2012 - link

    In the US, possibly, but I dare say 55-60" TVs are far from the norm everywhere.
  • peterfares - Thursday, September 27, 2012 - link

    2560x 27" and 30" monitors are NOT very pixel dense. The 27" is slightly more dense (~12.5%) than a standard display, but the 30" is only about 4% more dense than a standard display.

    A 1920x1080 13.3" display is 71.88% more dense than a standard display.
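
The percentages in the comment above work out as follows if the unstated "standard display" baseline is taken to be the nominal 96 PPI desktop default (an assumption, since the comment does not specify it).

```python
# Pixel-density figures from the comment above, computed against an assumed
# 96 PPI "standard display" baseline.
import math

BASELINE_PPI = 96.0  # assumed baseline

displays = [
    ("27in 2560x1440", 2560, 1440, 27.0),
    ("30in 2560x1600", 2560, 1600, 30.0),
    ("13.3in 1920x1080", 1920, 1080, 13.3),
]

for name, w, h, diag_in in displays:
    ppi = math.hypot(w, h) / diag_in          # pixels along the diagonal / inches
    delta = (ppi / BASELINE_PPI - 1) * 100
    print(f"{name}: {ppi:.1f} PPI, {delta:+.1f}% vs. baseline")
# -> roughly +13%, +5%, and +72%, in line with the comment's ~12.5%, ~4%, ~71.88%
```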
  • dcaxax - Tuesday, April 24, 2012 - link

    On a 32" you will certainly not see a difference between 720p and 1080p; it is barely visible on a 40". Once you go to 52"+, however, the difference becomes visible.

    On a 61" screen, as you suggest, the difference will be quite visible.

    Having said that, I am still very happy with the quality of properly mastered DVDs, which are only 576p, on my 47" TV.

    It's not that I can't tell the difference; it's just that it doesn't matter to me that much, which is why I also don't bother with madVR and all that, and just stick to Windows Media Center for my HTPC.

    Everyone's priorities are different.
