Since the Apple event, everyone has been going mad over resolutions. The main difficulty, as far as I can tell, is that the subject has never been broached, and so people haven’t been told what to think.
“I’ve been using 100 dpi (or less) for years, and it’s suited me fine,” they say.
So, what is the resolution of the human eye? It’s hotly contested, of course, and there doesn’t appear to be a standard way of measuring it. In general, though, you measure with line pairs: find the viewing distance, or the print size, at which the pairs of lines blend together.
A quick test on my 100 dpi monitor puts it at about three feet, for me. Other research has produced various figures, such as about 125 line pairs per inch at about a foot. The important thing to realize is that those are line pairs, not rows of pixels: you need two pixels to render a pair of parallel lines, so the equivalent resolution is more like 250 dpi. That’s the realm where the average human eye starts having trouble, though there are, of course, plenty of edge cases where someone could still pick out individual pixels at that resolution and distance. 300 dpi is popularly accepted as the point where graininess disappears for most eyes. If you want high fidelity, 600 dpi is good; for material examined at close range, 1200 is better, and 2400 is probably what you’d want for anything of truly high quality.
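To make that arithmetic explicit, here’s a quick back-of-the-envelope sketch in Python, taking the ~125 line-pairs-per-inch figure above as the assumed input:

```python
# Convert a line-pair figure into the pixel density needed to render it.
# Assumes the ~125 line-pairs-per-inch figure quoted above.
line_pairs_per_inch = 125

# Each line pair needs two rows of pixels: one for the line, one for the gap.
pixels_per_inch = line_pairs_per_inch * 2

print(pixels_per_inch)  # 250 -- roughly where the average eye stops resolving detail
```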
So, as far as monitors go, 200 dpi is just fine at a distance of two or three feet. Once you get to a foot or closer, as mobile devices are usually held, you’ll want something higher, in the 400–600 dpi range.
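The detail the eye can resolve scales inversely with viewing distance (twice as far away, half the density needed), so a rough sketch of the required density, assuming ~250 dpi at one foot as the baseline from above, looks like this:

```python
# Rough required pixel density as a function of viewing distance,
# assuming resolvable detail scales inversely with distance
# and taking ~250 dpi at one foot as the baseline.
baseline_dpi = 250.0
baseline_distance_ft = 1.0

for distance_ft in (1.0, 2.0, 3.0):
    required_dpi = baseline_dpi * baseline_distance_ft / distance_ft
    print(f"{distance_ft:.0f} ft: ~{required_dpi:.0f} dpi")

# 1 ft: ~250 dpi, 2 ft: ~125 dpi, 3 ft: ~83 dpi
# -- which is why 200 dpi is comfortably enough at desktop distances.
```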
If you built a full 1920×1200 display at only 200 pixels per inch, it would measure 2,264 pixels diagonally, which works out to 11.32″. At 224 pixels per inch, you could fit that into a netbook.
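The diagonal math works out like this (a quick sketch, nothing more):

```python
import math

# Physical diagonal of a 1920x1200 panel at a given pixel density.
width_px, height_px = 1920, 1200
diagonal_px = math.hypot(width_px, height_px)  # ~2,264 pixels

for ppi in (200, 224):
    print(f"{ppi} ppi: {diagonal_px / ppi:.2f} in")

# 200 ppi: 11.32 in, 224 ppi: 10.11 in -- netbook territory
```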
Of course, such technology could be prohibitively expensive, so we’re likely to stick with increasing the density of smaller mobile screens, for now. Besides, I don’t believe we have the infrastructure to be shuttling movies around in quad-HD.