Blu-ray Remains Noticeably Better Than 1080p iTunes

Analysis

I love Ars Technica; I think it’s one of the finest tech sites on the Internet. But I was a bit concerned by their article Wednesday morning comparing the quality of iTunes’ new 1080p content to Blu-ray. While the article concludes that Blu-ray is still the superior format from a picture-quality perspective, its comparisons, performed with a questionable methodology, led site after site yesterday afternoon to quote the analysis and declare “iTunes nearly as good as Blu-ray.”

While the definition of “nearly” is certainly open to debate, I hope to show that for videophiles and uninterested wives alike, the answer is “No, iTunes is not a good replacement for Blu-ray for those who care about picture quality.”

First, my testing methodology is different. Ars used a camera to photograph their 1080p monitor displaying the content. On Windows, one can get around the DRM limits that Apple places on iTunes content, so I was able to take direct screenshots of both the 1080p iTunes version and the Blu-ray, delivering a far more accurate comparison.

I used the same film, 30 Days of Night, as used in the Ars comparison, and took screenshots in uncompressed TIFF format. The images were completely untouched (you’ll notice drastically better colors in the Blu-ray) and were only converted to JPEG in the final step of uploading to The Mac Observer’s servers. One thing to note is that iTunes stubbornly doesn’t let you dismiss the player controls when the movie is paused, so I tried to move the control panel away from the focus area on each iTunes screenshot.

Displayed below are side-by-side 300x300-pixel excerpts from each 1920x800-pixel image, with Blu-ray on the left and iTunes on the right. Click on each 300-pixel image to see the full screenshot. I’ll note that throughout the entire film, the colors on Blu-ray are superior, hands down. That alone may be reason enough for some people to choose Blu-ray, but these screenshots show that a large amount of detail is lost to Apple’s compression as well.
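For anyone who wants to reproduce this kind of comparison, here is a minimal sketch of the crop-and-convert step using Python and the Pillow library. The file names and crop coordinates are placeholder assumptions, not the exact values behind the images below.

```python
# Minimal sketch of the excerpt workflow described above, using Pillow.
# File names and crop coordinates are placeholders, not the actual values.
from PIL import Image

def make_excerpt(screenshot_path, out_path, left, top, size=300):
    """Crop a size x size excerpt from a full screenshot and save it as JPEG."""
    img = Image.open(screenshot_path)           # e.g. an uncompressed TIFF capture
    box = (left, top, left + size, top + size)  # (left, upper, right, lower)
    img.crop(box).convert("RGB").save(out_path, "JPEG", quality=95)  # JPEG only at the last step

# Use the same crop window for both sources so the excerpts line up side by side.
make_excerpt("bluray_frame1.tiff", "bluray_1.jpg", left=820, top=300)
make_excerpt("itunes_frame1.tiff", "itunes_1.jpg", left=820, top=300)
```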

[Blu-ray 1 | iTunes 1]

Here, skin and facial hair are mushed together by Apple’s compression and become unnaturally smooth.

[Blu-ray 2 | iTunes 2]

This is a dark scene, but you can see that the Blu-ray retains detail around the mouth; the beard is clearly visible. In the iTunes version, so much detail is lost to compression that it looks like the character has no beard on the left side of his face.

[Blu-ray 3 | iTunes 3]

Here again, skin in the iTunes version loses all its fine detail. Skin is supposed to have detail, not be smoothed out like an airbrushed model. 

[Blu-ray 4 | iTunes 4]

Detail on the Blu-ray stands out on the blood splattered on the character’s face, the hair in the background, and the edges of his jacket.

[Blu-ray 5 | iTunes 5]

The female character’s eyes are sharper, her hair has detail throughout and the skin is realistic. The iTunes version blurs the skin and hair, and is softer overall.

While working on these comparisons, my wife, who normally doesn’t care about these things at all and will happily watch standard-definition Netflix and cable shows all day long, entered the room and saw the two versions of the film I had up on my screen. Without telling her what I was doing and why, she said “Wow, why does that one [the iTunes version] look so bad?” I then explained my purpose in comparing the two formats and she watched as I ran through the movie picking out screenshots. Without caring about Blu-ray or knowing which video belonged to which format, she identified the Blu-ray as being superior every time.

This article is not intended to tell you that Blu-ray is wholly better than iTunes content. There are many advantages to iTunes that Blu-ray doesn’t offer, and there are many advantages to Blu-ray besides picture quality, such as lossless audio, extra features, and the ability to loan or sell the film. So, by all means, tell me that iTunes is more convenient, is available any time without having to go to a store, and may perhaps be cheaper for new releases (although that advantage becomes a liability after a few months when retail Blu-rays are heavily discounted and iTunes HD purchases are still $20). 

But don’t tell me that iTunes’ current iteration of 1080p content is “nearly as good” as Blu-ray. They are worlds apart for any discerning viewer, and those who care even a little bit about picture quality won’t be turning to iTunes for their favorite films any time soon.

Comments

FlipFriddle

I’ve found that DVDs I’ve ripped in Handbrake at 720p for the previous generation AppleTV don’t look as good as the DVDs themselves played from a DVD player, so I’m not surprised. I’m more surprised about the color difference than detail. That I think is more distressing.
Still, convenience trumps quality for a lot of people.

Lee Dronick

Still, convenience trumps quality for a lot of people.

Yup. We had this discussion at a party last Saturday.

Not to mention that in a year or so “they” will be wanting us to buy Orange-Beam or some other new type of optical media.

Ross Edwards

This just in: a 40GB file at 1080p is going to look better than a 3GB file at 1080p.  It’s a question of degree, and in terms of quality per size, the iTunes files are the clear winner.  Add in the omission of annoying prohibited user ops (forced ads and goofy menus) and the choice is clear.
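A rough back-of-the-envelope sketch in Python puts those sizes in bitrate terms; the ~113-minute runtime is an assumption, and the disc figure includes audio and extras, so the real video bitrates are somewhat lower.

```python
# Back-of-the-envelope average bitrates for the file sizes mentioned above.
# The sizes and the ~113-minute runtime are approximations, not measurements,
# and the Blu-ray figure includes audio and extras as well as video.
def avg_bitrate_mbps(size_gb, runtime_min):
    bits = size_gb * 8 * 1000**3      # decimal gigabytes to bits
    return bits / (runtime_min * 60) / 1e6

runtime = 113  # assumed runtime in minutes
print(f"Blu-ray: ~{avg_bitrate_mbps(40, runtime):.0f} Mb/s")  # roughly 47 Mb/s
print(f"iTunes:  ~{avg_bitrate_mbps(3, runtime):.0f} Mb/s")   # roughly 4 Mb/s
```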

I have the blu-rays for anything I consider very special and high-end or demoable, such as the extended LOTR trilogy, Tron Legacy, some Pixar, etc.  For the rest, iTunes will do nicely and you really do hardly notice any difference.  If the ATV3 fixes the framerate feed, it will be game over.

Also, FlipFriddle, DVDs are 480p so if you ripped them at 720p they are going to look like crap.  A properly ripped DVD should look exactly as good as its source, given that H.264 is far superior to MPEG-2 in compression quality at any given size.

Helge

Again I am surprised that there are still folks around who honestly seem to think quality will get better when re-encoding with lossy compression algorithms like H.26x. The resolution will stay NTSC or PAL (in my case, 576 lines) even when encoded at 720 lines. Re-encoding here will always worsen the picture quality noticeably.

But on topic: if iTunes uses, as I expect, half the bandwidth of a standard BD (which averages around 30 Mb/s), the picture will be only half as good, which means blurriness. The pictures above actually show this very well. They could use the full bandwidth, but then every viewer would need at least a VDSL connection. (To my knowledge there will be no better technology than H.264 for this kind of application in the near future.)

Lee Dronick

forced ads and goofy menus

Especially the goofy menus

Mikuro

I could be wrong, but I don’t think the color differences are really due to a difference in encoding format. It seems like they were encoded from different sources. Perhaps they decided that different color balances would give better results in likely use-cases for different formats. Apple’s glossy LCDs, for instance, are well known for having exceedingly high color saturation (you probably remember the fierce complaints from graphic artists when glossy LCDs first hit the scene), so perhaps this was done to counteract that. I don’t know. My point is, if the color difference was an intentional decision, it’s not really a fair comparison of formats. If the color difference was not intentional, I’d really like to know how it came about.

I’m sure iTunes’ quality WILL be worse no matter what, because AFAIK they are using the same encoding formats (H.264) and anything going over the Internet is bound to have a significantly lower bitrate than Blu-ray. You don’t mention the bitrates in the article, though. What does iTunes use?

I find HD to be somewhat overrated when it comes to streaming. “HD” and “1080p” only specify resolution, not quality, and not even effective resolution, which is what’s really important—e.g. upscaling a DVD to 1080p doesn’t give you any more effective resolution than the 480p original, because scaling does not work that way.

So most HD videos you see on, for example, YouTube, actually have pretty poor quality. Bitrate matters more than resolution. If you go from 720p to 1080p without at least doubling your bitrate, don’t expect a drastic difference.
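A quick bits-per-pixel sketch makes the resolution-vs-bitrate point concrete; the bitrates below are purely illustrative assumptions, not published figures for any service.

```python
# Illustrative bits-per-pixel-per-frame figures for a few resolution/bitrate
# combinations. The bitrates are assumptions chosen only to show that
# resolution alone says little about quality.
def bits_per_pixel(width, height, bitrate_mbps, fps=24):
    return (bitrate_mbps * 1e6) / (width * height * fps)

cases = [
    ("720p stream at 4 Mb/s",    1280, 720, 4),
    ("1080p stream at 5 Mb/s",   1920, 1080, 5),
    ("Blu-ray video at 25 Mb/s", 1920, 1080, 25),
]
for name, w, h, mbps in cases:
    print(f"{name:26s} {bits_per_pixel(w, h, mbps):.2f} bits/pixel/frame")
# The 1080p stream actually has fewer bits to spend per pixel than the 720p one.
```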

furbies

Add in the omission of annoying prohibited user ops (forced ads and goofy menus) and the choice is clear.

+1 against the stupid enforced ads.

If I watch a BD, I load it up and go make coffee, or get munchies so I can skip the PITA trailers.

YodaMac

Sorry, but you show me a 3GB file compared to a 40GB Blu Ray and Yes, there are differences, but I find it AMAZING how good the iTunes file looks side-by-side to the Blu Ray!!!  Nearly as good AND only 3GB!!!!!
That’s what’s so awesome about it!

p.s.  I recommend you don’t sit so close to your TV as to discern actual hairs on someone’s eyebrows. Didn’t your mother teach you anything about properly watching a movie?

:)

Mark Block

I think your methodology is indeed better for revealing the compression artifacts in the iTunes files, but not necessarily so good for judging color differences. I’d rather have a Blu-ray player and an Apple TV each play out to a calibrated TV monitor via HDMI, then photograph the results. (It would also be nice to calibrate the TV to reference color bars for each source.) The gamma and color profile of a TV are different from those of a computer monitor, and QuickTime (the engine of which is used by iTunes for video playback) may be incorrectly interpreting the gamma of the encoded H.264 video file when playing it on your Windows computer.

I’m a video editor who has done color correction. There is no video reference standard for computer playback like there is for TV. The movie you used for reference was color corrected to look right on a TV (probably an expensive, carefully calibrated broadcast monitor). On a computer, all bets are off. The fact that Blu Ray looks better on your Windows system might just be an accident. Or not.

I have three monitors in front of me right now, two computer monitors and one TV monitor, with the TV and one computer monitor hooked up to a Kona card for video output. My own color-corrected videos look wildly different when playing out to the three monitors, even though they have all been set up for video as carefully as possible. The only one I can guarantee to be “true” is the TV (using Final Cut Pro and the Kona for playback). No one knows what gamma and color space issues are happening to your iTunes movie on a Windows system, so I don’t think it’s a valid test.

Mark Block

BTW, the fact that the two sources looked very similar in the Ars Technica test, while your comparison showed huge color differences, indicates to me that their test setup was probably more correct.

russell

But on topic: If iTunes uses like I expect half of the bandwidth of a standard BD (average 30 Mb/s), the picture will be only half as good;

No, you have missed the key point about compression. Even if we can agree on a measure of “good”, half the bitrate with good compression is much better than half as good; perhaps ten times less bitrate is half as good.

Not only is some compression gain “lossless” (e.g. a .zip of a document can be ten times smaller with no loss), but clever modern compression gives far greater size gain than quality loss.
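A tiny Python sketch illustrates the lossless half of that point; the repeated test string is just a stand-in for a redundant document.

```python
# Demonstration of the lossless point: a highly redundant "document" shrinks
# dramatically and decompresses back byte-for-byte identical.
import zlib

document = ("The quick brown fox jumps over the lazy dog. " * 200).encode()
compressed = zlib.compress(document, 9)

print(len(document), "bytes ->", len(compressed), "bytes")  # a large reduction
assert zlib.decompress(compressed) == document              # no loss at all
```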

===

In the early days of HDTV a famous visitor was being shown round the BBC’s first-ever HD studio in London. Seeing a monitor, he said something like: “That’s fantastic, far better than anything I can ever see at home.” After an embarrassed pause he was told, “That’s just a standard-definition studio monitor, correctly set up and adjusted.”

We are going to have a whole “Golden eyes” phenomenon of people who can see differences, to match the “Golden ears” of hi-fi (sorry - “extreme audiophiles”) who can hear non-existent differences.

Most people are pretty happy with MP3 audio; at home some people bother to go to much higher quality, but it is a hobby, not a necessity. A few even refuse to ever listen to recordings and only listen to live music (which makes more sense to me).

Good article and good comparison on the blurring effects of compression - though if you bother to read the scientific literature, it has all been measured before. The colour is not a fault but an adjustment, as has been pointed out. Most people have chroma levels set so high that reducing colour is a big improvement.

Russell

Helge

No, you have missed the key point about compression.

Actually I did not - but maybe I oversimplified it. However, I did not want to discuss compression algorithms in depth here; I think nobody would be interested ;)

But this much in defense of my statement: the objective quality of an encoding is linear in the bitrate you use (at least with an idealized source where the “quality” is unbounded). Codecs like H.264, however, try hard to optimize the perceived quality and are usually tuned to work better at lower bitrates. But this is of course highly subjective; a discussion would be pointless. (As for me, I like HQ stuff since I find compression artifacts highly annoying; but friends of mine watch movies via 100 kb/s streaming and feel OK, too.)

russell

russell said:

No, you have missed the key point about compression.

Actually I did not - but maybe I oversimplified it. However, I did not want to discuss compression algorithms in depth here; I think nobody would be interested

But this much in defense of my statement: the objective quality of an encoding is linear in the bitrate you use (at least with an idealized source where the “quality” is unbounded). Codecs like H.264, however, try hard to optimize the perceived quality and are usually tuned to work better at lower bitrates. But this is of course highly subjective; a discussion would be pointless. (As for me, I like HQ stuff since I find compression artifacts highly annoying; but friends of mine watch movies via 100 kb/s streaming and feel OK, too.)

===

“Objective quality linear to the bitrate”  is a bold claim, and one that I think is not supported by the evidence.  For example:

Take a case where the sampling rate is very high, say ten times more than necessary, and the sampling rate is slowly reduced. Linear reduction in bit rate, but no change in quality until the sampling rate is suddenly not high enough (the Nyquist criterion) and then a sudden, very non-linear drop in quality as aliasing starts.
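Here is a short numerical sketch of that effect in Python/NumPy; the 3 kHz test tone and the sample rates are arbitrary illustrative choices.

```python
# A 3 kHz tone sampled at progressively lower rates. While the rate stays
# above twice the tone frequency the recovered peak is exact; below that it
# folds to a false, aliased frequency. The tone and rates are arbitrary.
import numpy as np

tone_hz = 3000.0
for fs in (48000, 12000, 8000, 5000, 4000):
    t = np.arange(0, 0.5, 1.0 / fs)            # half a second of samples
    x = np.sin(2 * np.pi * tone_hz * t)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    print(f"fs = {fs:5d} Hz -> apparent tone {freqs[np.argmax(spectrum)]:6.0f} Hz")
```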

Modern compression uses psycho-acoustic research results to effectively hide quite a lot of compression from most people; I maintain quality is highly non-linear with respect to bit rate for a lot of compression schemes.

Russell

Helge

Linear reduction in bit rate, but no change in quality until the sampling rate is suddenly not high enough (Nyquist Criterion) and then a sudden, very non-linear, drop in quality as aliasing starts.

But isn’t this exactly what I said? Aliasing is a question of perception. Please re-read the part about the idealized source: in that case there will always be aliasing.

And to my knowledge the claim “objective quality is linear in the bitrate” is supported by simple logic, since you cannot have data magically disappear and then reappear. This will always be true: if I drop half of the data, it is gone forever - remember we are talking about lossy compression here; the point is to drop the data that some guru psychologists once deemed we would not miss if it were gone.

russell

russell said:

Linear reduction in bit rate, but no change in quality until the sampling rate is suddenly not high enough (Nyquist Criterion) and then a sudden, very non-linear, drop in quality as aliasing starts.

But isn’t this exactly what I said? Aliasing is a question of perception. Please re-read the part about the idealized source: in that case there will always be aliasing.

Aliasing is not perception; it occurs when the sampling rate is less than double the highest frequency present. Provided the sampling frequency is high enough there will be no (as in zero) aliasing. Wikipedia can tell you more.

Helge

Aliasing is not perception,

I would say the opposite, but of course that applies to our discussion here. Aliasing is, in simple terms, spurious data appearing because of an insufficient sampling rate, which is wrong data. Again: you will always have aliasing. Example: when mastering a CD we apply a low-pass filter, throwing away higher frequencies (Shannon). This way we avoid most of the audible aliasing. Could we not leave this filter out if our sampling rate were high enough to put the resulting aliased frequencies above our perception level?
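A quick worked example of where an out-of-band component lands after sampling; the 30 kHz test frequency is an illustrative assumption.

```python
# Where does a 30 kHz component land after sampling? (30 kHz is an arbitrary
# illustrative choice.) The apparent frequency is found by folding about
# multiples of the sampling rate.
def aliased_frequency(f, fs):
    """Apparent frequency of a tone f after sampling at rate fs."""
    f = f % fs
    return min(f, fs - f)

for fs in (44100, 96000):
    print(f"30 kHz sampled at {fs} Hz appears at {aliased_frequency(30000, fs):.0f} Hz")
# At 44.1 kHz it folds down to ~14.1 kHz (clearly audible), hence the low-pass
# filter before sampling; at 96 kHz it stays at 30 kHz, above the audible range.
```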

Mikuro

I think the point Russell’s trying to make is that while quality is related to bitrate, it is not necessarily linearly proportional. There is a point where doubling your bitrate will have very little effect on quality, and there is a point where doubling your bitrate will more than double your quality.

I believe this is true both objectively (signal to noise ratio) and subjectively. But I’m going a little bit beyond my depth here, so maybe I’m wrong about the SNR part.

lujan

Great comparison; I thought that yesterday’s comparison was off somehow…  Does anyone know if there has been the same type of comparison between Blu-ray and Vudu?

bob the sponge

Thx for the work. Very interesting.

I look forward to reading a long test somewhere, sometime, with the following:

- more than one film in the comparison (maybe 30 Days of Night has a problem)
- comparison with DVD, HD cable, SD cable, Netflix, and SD iTunes together (that way one would know where to look for what kind of content)

From what I’ve seen here I wonder whether I’d rather watch a DVD or HD iTunes.

Well, at least I will continue to buy BDs to watch my favorite movies.

cubefan

I guess this is why a buy-once, use-anywhere video stream would be great; UltraViolet does just that. Trouble is, Apple is not a member :-( Check out UVVU.

The debate over colour is just that - colour is in the eye of the beholder - but proper colour correction goes a long way toward realistic viewing. I couldn’t believe just how good HD was until I saw an HD TV with a proper HD source; the detail is amazing, and a laptop is not an ideal viewing platform. Just my two penn’orth :D
