That iPhone 4 Display Thing: Sorting Out Retina Display

| Just a Thought

During the WWDC keynote Steve Jobs introduced the next version of the wildly popular iPhone. The new device, iPhone 4, has new display technology that decreases the size of each display pixel while increasing pixel density to the point where it is hard for the average person to see individual pixels at a typical viewing distance. The net effect is sharper images. Apple calls this technology Retina Display.

Jobs said, “…there’s a magic number, right around 300 pixels per inch, that when you hold something ten to twelve inches from your eyes, is the limit of the human retina to differentiate the pixels.”

What he wanna say that for?

A brainiac and owner of DisplayMate Technologies (yes, he’s selling something), Dr. Raymond Soneira, PhD, disputed Jobs’ statement, saying that the pixel density needed to be half again denser than what the iPhone 4 sports to be properly called a retina display. In an email to Jason Cross at PCWorld, Dr. Soneira said, “Steve Jobs claimed that the iPhone 4 has a resolution higher than the retina - that’s not right:

1. The resolution of the retina is in angular measure - it’s 50 Cycles Per Degree. A cycle is a line pair, which is two pixels, so the angular resolution of the eye is 0.6 arc minutes per pixel.

2. So if you hold an iPhone at the typical 12 inches from your eyes, that works out to 477 pixels per inch. At 8 inches it’s 716 ppi. You have to hold it out 18 inches before it falls to 318 ppi.

So the iPhone has significantly lower resolution than the retina. It actually needs a resolution significantly higher than the retina in order to deliver an image that appears perfect to the retina.”
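
Soneira’s figures check out with a little trigonometry. Here is a minimal Python sketch (mine, not his) that converts an angular resolution in arcminutes and a viewing distance into the pixel density needed to reach that limit:

import math

def ppi_limit(arcmin_per_pixel, distance_inches):
    # Pixels per inch at which one pixel spans the given angle at the given distance.
    pixel_size = distance_inches * math.tan(math.radians(arcmin_per_pixel / 60.0))
    return 1.0 / pixel_size

# Soneira's 0.6 arcminute-per-pixel "retina" limit at three viewing distances:
for d in (8, 12, 18):
    print(f"{d} inches: {ppi_limit(0.6, d):.0f} ppi")

That prints roughly 716, 477, and 318 ppi, matching the figures he quotes.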

Other folks have weighed in on the controversy, offering opinions and math adjusted to suit the numbers commonly used for measuring human visual acuity.

The resulting media crap-storm is, at least at the surface, laughable and strangely akin to Congressman Wilson’s “You Lie!” outburst to President Obama. The problem is that the more I think about this situation, the more the implications give me less reason to laugh and more reason to be concerned.

Who’s right?

I’ll admit I’m not smart enough to dispute Dr. Soneira’s assertion on my own; all I have is the Internet. The doctor’s got numbers to back up his math and I’m just OK with numbers, but what I’m seeing in other, older articles about human visual acuity leaves me scratching my head.

For instance, Dr. Soneira says, “the angular resolution of the eye is 0.6 arc minutes per pixel.” My search says that 0.6 arcminutes is about 0.01 degrees of arc; however, many other sites claim that human visual acuity ranges between 0.02 and 0.03 degrees, which equals about 1.2 arcminutes if we use 0.02 degrees.
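
For reference, the unit conversion is just a factor of 60, since an arcminute is 1/60 of a degree:

0.6 arcminutes / 60 = 0.01 degrees

0.02 degrees × 60 = 1.2 arcminutes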

Enter Phil Plait over at Discover.com. Phil worked through the math (and I checked it… snicker) using 1 arcminute, which is a number agreed to by astronomers and other smart people the world over as being the average person’s ability to discern an object as being more than a dot at a given distance.

According to Mr. Plait (not a doctor nor does he have alphabet soup behind his name), if 1 arcminute is used then Jobs’ claim is correct, with some room to spare (which Jobs also claimed). Here’s an excerpt from Mr. Plait’s blog:

“Jobs claims the iPhone held at 12 inches from your face has pixels too small to be resolved by your eye. Soneira, the display expert quoted in the magazine articles, disputes that. He uses the 0.6 arcmin resolution for the human eye (so we use the scale factor = 5730). Let’s use that and run the numbers.

Something 12 inches away means your eye can resolve dots that are bigger than

12 inches / 5730 = 0.0021 inches

So if the pixels on the iPhone are smaller than 0.0021 inches in size, then Jobs is right. Your eye won’t resolve them. If the pixels are bigger, Soneira is right, and your eye can resolve them.

The actual iPhone 4 has 326 pixels per inch (the display is 960 pixels high, and about 2.9 inches in length). You have to flip that to get the size of the pixel in inches:

1 / 326 = 0.0031 inches

Uh oh! Things look bad for Jobs. The iPhone pixels are too big! At one foot away, your eye can resolve the pixels, and Jobs must be lying!

Or is he? Remember, Soneira used the 0.6 arcmin resolution of the eye, but that’s for perfect eyesight. Most people don’t have perfect eyesight. I sure don’t. A better number for a typical person is more like 1 arcmin resolution, not 0.6. In fact, Wikipedia lists 20/20 vision as being 1 arcmin, so there you go.

If I use 1 arcminute instead, the scale factor is smaller, about 3438. So let’s convert that to inches to see how small a pixel the human eye can resolve at a distance of one foot:

12 inches / 3438 = 0.0035 inches

Aha! This means that to a more average eye, pixels smaller than this are unresolved. Since the iPhone’s pixels are 0.0031 inches on a side, it works! Jobs is actually correct.”
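
If you want to run Mr. Plait’s numbers yourself, here is a small Python sketch (my own, not from his post) that follows his scale-factor approach for both the 0.6 and 1 arcminute assumptions:

import math

def scale_factor(arcmin):
    # Ratio of viewing distance to the smallest resolvable size for a given angular resolution.
    return 1.0 / math.tan(math.radians(arcmin / 60.0))

iphone4_pixel = 1.0 / 326   # iPhone 4 pixel size in inches, about 0.0031

for arcmin in (0.6, 1.0):
    smallest = 12.0 / scale_factor(arcmin)   # smallest resolvable feature at 12 inches
    verdict = "can" if iphone4_pixel > smallest else "cannot"
    print(f"{arcmin} arcmin (scale factor {scale_factor(arcmin):.0f}): the eye {verdict} resolve the pixels")

With 0.6 arcminutes (a scale factor of about 5730) the pixels are resolvable; with 1 arcminute (about 3438) they are not, which is exactly the disagreement above.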

If the math has left you wide-eyed, here’s the upshot: Jobs’ claim is based on commonly accepted numbers for measuring visual acuity, and by those numbers Jobs is correct for anyone with average vision. If you’re able to see a lot better than 20/20, then you would see pixels on the iPhone 4 screen.

In another email to Jason Cross, Dr. Soneira defends his use of 0.6 arcminutes as a measuring standard. He says, “There have been some comments that my analysis is for perfect vision. Jobs’ statement is for the *retina* not the *eye* with a poor lens. If you allow poor vision to enter into the specs then any display becomes a retina display.”

It’s not just Wikipedia that lists average human visual resolution at 1 arcminute; as I mentioned earlier, that number can be found all over the Internet (see a few references below).

So, is Dr. Soneira right and does that mean Steve Jobs is full of it?

It seems that both are right, depending on whether you take the purist point of view (Dr. Soneira) or the realist point of view (Jobs). The funny thing is that all of this could have been avoided if Steve Jobs had said, “…average human eyes…” instead of “…human retina…” If he had, then any claim of “puffery” could have been avoided or dismissed. As it is, the media seems to have made a mountain out of this molehill, and the good Dr. Soneira has made a name for himself at the expense of Steve Jobs.

Here’s the thing: what Jobs says on stage directly reflects on him as a person and on Apple as a technology entity. If he makes false or exaggerated claims, it dilutes Apple’s credibility. A lot of fun has been made of Jobs’ “Reality Distortion Field,” but most, if not all, claims made while immersed in that field could be validated later, once the effects had worn away.

This is different. While the display for the new iPhone will likely kick all manner of butt, making unsubstantiated claims about it hurts Apple’s reputation as a reliable, dependable, and most of all, truthful company. Apple can’t afford that. No reputable company can. Luckily, Jobs’ display statements are NOT unsubstantiated and are, in fact, based on provable data, but the damage has been done by the tech news community, which seems too eager to report any Apple-related dirt without first checking the facts.

Now what?

I believe it’s time for another open letter from Steve Jobs to the public addressing the display issue. If Jobs wants people to believe him about Flash and other controversial issues, then he needs to make it clear that this was not a breach of trust or an act of puffery, but a tempest in a teapot that is hardly worth all the attention.

During the WWDC keynote Steve Jobs showed that he is in touch with the people who use his products. He joked about many of the controversies surrounding the iDevices including Apple’s infamous application approval process. I’m hoping that he continues being open. This display issue is not a big deal in and of itself, but if left to fester it could haunt Apple and Jobs for years to come.

For more information about human visual acuity, check out Blaha.net along with the downloadable PDFs from the University of Arizona College of Optical Sciences and the National Institute of Standards and Technology.

Comments

Bosco (Brad Hutchings)

Vern, it’s essentially the same issue as how big a 1080p display to get for a room. 46” works if you’re sitting 8 feet or more back.

I was making a training/marketing video the other day, and I wanted to see how it played on my Nexus One (AMOLED display) and my iPhone 3GS. Side by side, indoors, brightness at comfortable levels, at regular viewing distance, the difference was night and day. The colors on the Nexus One are vibrant and pure. The iPhone “white” looks comparatively yellow.

I will have to do a side by side with a 4G if they let me in the Apple Store—I’m sure I’m on a watch list at this point. But I’d bet that at arm’s length, vibrancy of colors will still be more obvious than pixel density.

geoduck

I worked as an Astronomer and Astronomy Teacher for a few years. The figure we always used was 1 arcminute. There are people with sub-arcminute resolution; one of my Astronomy professors in college had that. That’s one person in nearly 50 years. They are few and far between. In addition, with printers, 300 dpi is accepted as the highest useful resolution for the average reader. Sure, you can print something at 2000 dpi, but unless you are planning on using magnification it doesn’t help. Lastly, as we get older even those who had perfect vision find it gets degraded with age, especially when trying to focus close up, like 12”.

For reference the moon is ~30 arcminutes across.

True SJ could have put it better, but this is just more of the anti-Apple press puppies trying to create a ‘scandal’ by nipping and yapping. There are bigger fish to fry out there.

jfbiii

Steve Jobs claimed that the iPhone 4 has a resolution higher than the retina

Really? Is that what Steve Jobs actually said? Or is that just what Soneira decided to hear?

The quote from Jobs’ keynote always seems to be:

“…there’s a magic number, right around 300 pixels per inch, that when you hold something ten to twelve inches from your eyes, is the limit of the human retina to differentiate the pixels.”

That’s not the same as saying, “the iPhone has a resolution higher than the retina.”

Vern Seward

Jfbiii: After thinking about it for a bit I came to the same realization as you: Jobs did not say that the iPhone 4 had a higher resolution than the retina. It’s a small point that’s gotten lost in all the hubbub.

Geo: I hear ya. It seems this intense scrutiny is even more prevalent these days now that Apple is so popular. Whatchagonnado?

Bosco: from what I’ve read, the new display compares well for brightness and blows AMOLED displays away for clarity and definition. Like you, we’ll have to wait and see. I don’t think I’ll be disappointed.

Vern

exAppl088

Sorry to say, but you completely missed it with this analysis.

Steve Jobs said, in creating the NEW marketing term Retina Display, “what we (as Apple) call the ‘Retina Display’”. He defined this term, as you have quoted, with the following definition: “There’s a magic number, right around 300 pixels per inch, that when you (a normal person) hold something 10 to 12 inches from your (human physical) eyes, is the limit of the human retina to differentiate the pixels”.

As far as I could determine, no scientific definition of the term “Retina Display” pre-existed Mr. Jobs’ definition of it as a marketing term, so he was not hijacking a scientific term with the same name.

No scientist, with any amount of letters after his name, can make an obviously bogus attempt to redefine the marketing term “Retina Display” created by Mr Jobs during his Keynote speech, or set himself up as someone who has any authority to claim that such a term is “incorrect” by attempting to re-define the term created by Mr Jobs, as IF this new term should be or was being used by Mr Jobs as a scientific term, which it was not.

Essentially, Mr R. Soneira, of DisplayMate Technology, is not qualified to re-define a marketing term (Mr Jobs is, however) UNLESS it can be conclusively shown that this term was already in common and publicly known use with another meaning in, say, numerous public scientific reports, news publications, or anywhere else.

Mr Soneira and his own self-interest are just TOO LATE to change the definition; it is entirely BOGUS to claim that this new marketing term can even be incorrect, since no one else had any prior definition for the two words “Retina Display”.

It is even MORE BOGUS that the mass of “tech journalists” actually think that this specious attempt to hijack Mr Jobs’ definition, which was quite clearly about the vision of the Apple development staff (real users with human eyes) and potential customers who will be buying the iPhone 4, is even a valid activity. IMHO, it is a BOGUS activity.

This behavior is just so typical of those who ignore the context of a statement and its real meaning and attempt to de-construct it by switching the context (in this case morphing a marketing term into a non-existent scientific term), without even recognizing that that is what Mr Soneira did, OR calling him out for doing it, no doubt for his own self-interest as a vendor of display technology (as always, follow the money to reveal motivation!).

Bosco (Brad Hutchings)

Welcome to the United States of Apple, where marketing claims are not subject to scrutiny. Steve Jobs has personally been caught in numerous blatant marketing lies relating to the iPhone. So claims he makes will be subjected to all sorts of scrutiny whether the fanboys like it or not.

Nookster

Talking about marketing and other BS, I’m surprised there isn’t more bitching about iPad apps getting appended with “HD”, in all their 1024 x 768 glory.

exAppl088

It is typical of web noise makers to discount relevant content not by giving any evidence that what has been stated is in any way incorrect, but by presuming that the author’s statements must be invalid due to their own unsupported beliefs about the author.

I quit working at Apple in 1982 because of Steve Jobs’ self-centered management, and chose to be laid off in 1998 as the last person of the Newton group, rather than find another position at Apple, BECAUSE the Apple board let Steve Jobs hijack the board of directors and take over Apple after Apple bought NeXT.

I use Macs because they are better, more reliable desktop computer systems (written on a 2000 PwrMacG4DP); I don’t own an iPhone or iPad, but the new iPhone 4 is, IMHO, a real advance in capability that no one else has even begun to approach and probably will not for at least 12 months or more.

Bosco’s idea that Steve Jobs’ marketing claims are any more correct or incorrect than the marketing claims of Microsoft or any other technology company is obviously nonsense.

Mr Jobs has always been about style over substance, and ease of use over “specs”, as he doesn’t have, and has never had, any technical training or expertise, but he also doesn’t claim to be a technocrat.

Mr Jobs is an expert at fostering the development of excellent, easy-to-use products for the 95% of non-technocrat users in the world, and an expert at marketing those products to the millions of people who choose to buy them.

This latter fact is no doubt what creates so much irritation about Apple among the technocrats. Mr Jobs claims to know how to make great products for ordinary people, and there is plenty of evidence that this is true, even if that reality is invisible to many blinded by Mr Jobs’ style.

Mr Jobs’ creation of the term “Retina Display” was not a technical claim; it is just a marketing term to describe the obvious advance in the quality of the display which ordinary users will find in the iPhone 4.

Long live the marketing term Retina Display! I expect it will be adopted by many other vendors.

doogie

I don’t think that 20/20 or better vision is all that rare, but I work with a lot of pilots and perhaps my sample is skewed.  How many people have 20/20 vision with corrective lenses?  Also, people giving his claim the benefit of the doubt use the 12” numbers, but he said “10-12 inches”.  For his claim to be true at 10”, a higher resolution would be required.  This whole debate is a rathole and a sideshow, but it was precipitated by a verifiable claim made by Jobs backed by poor fact checking.

In the end you’ll either like the display, or not, or not care.  It will sway you subjectively towards or away from this device some small amount, regardless of what Jobs claimed and also whether it was true or false.

Vern does make a valid point about credibility, but Jobs’s RDF precludes it being a requirement.

Voice

In another email to Jason Cross, Dr. Soneira defends his use of 0.6 arcminutes as a measuring standard. He says, “There have been some comments that my analysis is for perfect vision. Jobs’ statement is for the *retina* not the *eye* with a poor lens. If you allow poor vision to enter into the specs then any display becomes a retina display.”

Dr. Soneira, with this quote, demonstrates that he’s willing to twist his own statements and other people’s into completely different meanings in order to be correct.

How?

Without a lens, poor or not, the retina doesn’t resolve anything other than a blurry, unintelligible mess.  Without the rest of the eye (or at least certain other key components), our retinas wouldn’t resolve a pixel 2” across, much less sub-millimeter ones.

As for Bosco?
With his logic, any current car manufacturer who claims their vehicle seats even one person is guilty of false advertising because people exist who can’t fit into them.

doogie

While talking subjectivity, I prefer the iPhone’s more realistic colors to those of the oversaturated AMOLED Nexus One display.  The Nexus One reminds me of the TVs on display at Best Buy.  There’s no arguing taste, though, and mine is swayed by sitting in front of a meticulously color calibrated monitor all day.  (Of course, the contrast on the Nexus One is delicious.)

Anybody out there using their phone as a color calibration device?  No?

Bosco (Brad Hutchings)

Oversaturation isn’t the right word. N1 colors are vivid. I know what you’re getting at, but it’s not oversaturation. When scenes are oversaturated, you get pixel drift. Colors fall off over edges. Not so with the N1’s screen. But if you’re used to looking at a matte screen all day, yeah, the AMOLED packs a hell of a color punch.

The iPhone 3GS’s screen looks downright dull compared to the N1, especially indoors. YouTube is the best example. Videos on the N1 just pop. To be fair, the 3GS is much more readable outdoors.

But all of this is subjective. Which, getting back to the point, is what makes Steve Jobs’ pseudo-technical claims more specious. It reminds me of screen sharing, something I know a little bit about. When people dive into exact numbers for latency and throughput, it’s easy to conclude that any screen sharing product sucks balls. When you put it in front of your face, and you find yourself having an interactive conversation with someone across the continent or on the other side of the world and the screen latency isn’t getting in the way, it’s magic. Even if there are 3 to 4 seconds of latency.

I get exAppl088’s point. But I say if Steve is going to put numbers and figures into his message to back up or motivate the subjective stuff, those numbers and figures are going to be dissected. That’s how the world works.

Bronco46blogs

GOOD GRIEF! Pick up the iPhone. If you like the way it looks, buy it; otherwise buy some other phone.
Arguing about minute measurements seems a silly exercise.
Clearly there are a lot of people without enough to do.

exAppl088

Nookster’s comment seems a bit odd, considering that the term “HD” was the creation of a committee that could not even agree on any one resolution for the marketing term “HD”!

Is HD 1366 x 720 or 1920 x 1080i or 1920 x 1080p or 480p?  Which is it?  (can All of these be correct???)

Considering that this marketing term is now applied to everything under the sun, including sunglasses, of all nonsense, I find no surprise in this non-technical marketing term being applied to iPad applications.

Blame the ATSC “standards” committee for a complete failure in defining a meaningful standard for “HD”.

So many have hijacked the term by now, and this isn’t a behavior unique to iPad Apps or even the iPad itself, since the only real meaning today for HD is “Higher Definition” where the base was NTSC video, or about 380K display pixels; thus even an iPad is higher definition, even if Apple doesn’t use the term in iPad marketing materials.

farmerbob

Since I heard that Jobs used the analogy of a Laser Printed Page (I haven’t had time to watch the keynote) . . . In the dark days before all this home printing and output was around, I owned a “service bureau” that laser imaged electronic files to film so that a printer (person) could burn plates, put them on a printing press, and print all those crystal clear pretty things. In doing so, you had to make sure the resolution of the photographs was such that you got the proper linescreen to print clearly.

I had two brands of output gear, Linotronic and Scitex. The Linos would image at 1200ppi (pixels per inch) and the Scitex would do 2400ppi. That was my Cadillac machine. The industry axiom, until ppi and dpi had gotten muddled together, was that 1200ppi was the limit at which there was no use going any further, because the human eye could not distinguish the difference after 1100. We did little business on the Scitex, and the Lino output would allow photo (no dots, lines or anything seen by the unaided human eye) clarity. So 1100ppi is the threshold of the human eye. I made a living from it for many years producing output for many high-end magazines and advertising. These days, with direct to plate and print, there is no need for service bureaus.

Nookster

Nookster’s comment seems a bit odd, considering that the term “HD” was the creation of a committee that could not even agree on any one resolution for the marketing term “HD”!

Is HD 1366 x 720 or 1920 x 1080i or 1920 x 1080p or 480p? Which is it? (can All of these be correct???)

Considering that this marketing term is now applied to everything under the sun, including sunglasses, of all nonsense, I find no surprise in this non-technical marketing term being applied to iPad applications.

Blame the ATSC “standards” committee for a complete failure in defining a meaningful standard for “HD”.

So many have hijacked the term by now, and this isn’t a behavior unique to iPad Apps or even the iPad itself, since the only real meaning today for HD is “Higher Definition” where the base was NTSC video, or about 380K display pixels; thus even an iPad is higher definition, even if Apple doesn’t use the term in iPad marketing materials.

Coincidentally, I think it’s odd that you think it’s odd. :)

The 480p standard is commonly referred to as EDTV nowadays, unless of course you think that the Nintendo Wii is an HD console.

I would imagine that the bastardisation of such terms will continue, though. To think, I was playing first-person shooters in the last decade in stunning 4:3 1024x768 HD, and watching PAL broadcasts decades prior in jaw-dropping 576i HD. Cool.

exAppl088

I would agree that the Nintendo Wii is not an HD console, but it clearly is a “better gaming experience” in the view of a huge number of users due to its creative GUI and input devices, which just underlines my point that ease of use and an excellent GUI are far more marketable than just “HD” specs, which appeal to technocrats like you and me.

FYI, I started my first programming job in Jan 1969, before 8-bit microprocessors existed, working for Control Data on the grandfather of RISC computing, the CDC 6600.

Nookster

I would agree that the Nintendo Wii is not an HD console

...and neither’s an iPad, so there you go. :P

Voice

@farmerbob
Ok, you claim experience with industrial-grade printers, so you *should* know this, but apparently you don’t.  Printers do not have ppi (pixels per inch) ratings.  They have dpi (dots per inch) or lpi (lines per inch) ratings (depending on the methods used).  In the dpi category, you’ve got to be aware that a dot is not equivalent to a pixel on a display.

A single modern pixel is capable of displaying 256 (or more, given the proper hardware) different levels of each color, from pure white to pure black.  A single modern dot is capable of displaying two shades of any given color (‘on’ or ‘off’).  It takes multiple dots from a printer to achieve the same range of colors displayed in a single pixel on a display, so claiming that printing 1200dpi is even remotely equivalent to 1200ppi is simply false.

For a color image, 1200dpi is roughly equivalent to 150ppi on a display (closer to 200ppi in practice due to dithering techniques in printing).  Modern industrial print hardware is capable of significantly more than that, of course, but 300ppi is the generally accepted equivalent in this day and age.

Industrial printers do, of course have the option of selecting specific color inks when they want to get really crisp, but that’s done for text and graphics, not photos simply because getting the color separation for a photo using arbitrary individual color halftones would be absurdly time consuming and difficult.

This is also why misinformed people thought for the longest time that you needed absurdly high resolution digital cameras to be able to produce better than wallet sized images.

farmerbob

Ok, you claim experience with industrial-grade printers, so you *should* know this, but apparently you don’t.  Printers do not have ppi (pixels per inch) ratings.  They have dpi (dots per inch) or lpi (lines per inch) ratings (depending on the methods used).  In the dpi category, you’ve got to be aware that a dot is not equivalent to a pixel on a display.

@Voice
Apples (no pun) and Oranges. I never said 1200ppi = 1200dpi, although I did say that the terminology has gotten mashed up. Nowhere in what I wrote did I use the designator DPI. Please reread. I know exactly what I am talking about (it served me very well for close to 30 years); obviously you are not following my track.

In the beginning of my post I said that since Jobs compared the rez to that of a Laser Printed Page, it’s not the best analogy, be it color laser or GS (Grey Scale). Color laser is more like inkjet and really does not use dots, but more of a mesh. It’s the devil to get a line screen reading off an inkjet or color laser print when rescanning it, because there is no defined dot pattern. Also, as I stated, in the print world (printing presses, not video displays), which is not what you are talking about, Pixels make Dots, and Dots make Line Screens. Print pixels have one level of existence/state of being: black, and as small as the rez of the imager can produce, which is where we know that 1100ppi is the visual threshold, as opposed to a display pixel, which has three main states: Chroma, Hue, and Luminance.

Display resolution is noted in xP x xP along with “Dot Pitch”, and that opens a whole new can of worms. A Dot Pitch of .23 to .20 (the lower, the finer the display and the “crisper” the detail) was optimum for Graphic level CRTs. Three things comprise the resolution of a visual display: the screen size (which I see so much griping about being too large), the PxP, and the dot pitch. The lower the dot pitch, the closer the pixels and the higher the rez of the display, because more fit in the given screen size.

The analogy was wrong in using that of a “Laser Printed Page”. GS Laser printers do use dots, whereas InkJet, now the most common printer, does not. It’s that mesh that lets you get away with murder when printing low rez photography. The process covers a multitude of sins, or the lack of resolution.

Industrial printers do, of course have the option of selecting specific color inks when they want to get really crisp, . . .

Now what kind of “industrial printers” are you talking about? If the kind that produces the magazine you read at your eye doctor’s office, you are wrong. If you are talking inkjet printing devices, you’re still wrong. In both cases a base of four colors is laid down: CMYK, Cyan, Magenta, Yellow, and Black. I never knew why they use K for Black. In more contemporary Inkjet printers they have added Light Magenta and Light Cyan. I guess this is to help get a closer match to the RGB of a display so that people don’t freak when the output doesn’t match the display. But in none of these cases is it the ink that denotes resolution.

In OFFSET printing you can vary the number of inks used, up from the base four, to add a PMS (Pantone Matching System) spot 5th color, or varnishes that make parts of the print job glossy or matte (flat). But it is the resolution/fineness of the dot on the plate for offset, and the proper file resolution for an inkjet. In none of these cases have I ever heard of using ink as a factor for “crispness”. It’s all in the resolution capability of the output device and the resolution of the file that it is given to “help” it attain the goal of the best looking “to the human eye” output. Low rez = fuzzy/mushy output. The higher the rez, the “crisper” the output. On anything. That is why special care is needed in attaining a higher resolution for “crisper” images, thus high-megapixel camera resolutions. THAT now allows digital photography to be used in high-end offset printing. Although you are right about not needing this for “wallet” sized inkjet prints. And none of this is absurd; it’s the law of the land in “professional printing/publishing”. Now in printing inkjet wallet pics, you’re right there. But that is not what I was talking about. I’m sorry I was not more clear or “crisper” and confused you.

AND then there is Dot Pitch in video displays. That seems to have gotten put aside these days as the quality of most displays has gotten lower as they try to fit pricing points.

Now this Wiki Page has an excellent and long discussion of the Pixel as it should apply here, and which Jobs should have used as a comparison/explanation. The Laser Print analogy was not right.
