This story was written by Keith Dawson for UBM DeusM’s community Web site Develop in the Cloud, sponsored by AT&T. It is archived here for informational purposes only because the Develop in the Cloud site is no more. This material is Copyright 2012 by UBM DeusM.

The Coming High-Resolution Revolution

Apple's Retina display is the harbinger of what's to come. Web developers and designers need to understand what the change means.

Apple's Retina display heralds a new world in which developers cannot assume that their users will view their work at 100 dpi. The changes will be pervasive.

For more than 15 years, across a range of display technologies, computer displays held steady at around 100 dots per inch (dpi). Text, graphics, and applications as a whole could be designed assuming 100 dpi, and they would display just fine on whatever computer screens or, later, smartphones they ran on. In June 2010, Apple changed the display game by introducing the iPhone 4 with its "Retina" display. The fallout from that introduction is still spreading, and it is past time for developers to consider the implications.

Apple uses the term "Retina display" because, it claims, the devices' pixel density is high enough that the human eye cannot make out individual pixels at each device's normal viewing distance. For the iPhone 4, iPhone 4S, and iPod Touch, this translates to a pixel density of 326 dpi; for the iPad 3, 264 dpi; and for the Retina MacBook Pro, 220 dpi.
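In practice, a page can ask the browser what kind of display it is on. The sketch below, which assumes a WebKit-based browser, reads window.devicePixelRatio (1 on a conventional ~100 dpi screen, 2 on a Retina display, where each CSS pixel is backed by four device pixels) and tags the document so stylesheets can react; the "hidpi" class name is just an illustration:

    <script>
      // devicePixelRatio is 1 on a conventional display and 2 on a
      // Retina display (each CSS pixel maps to four device pixels).
      var ratio = window.devicePixelRatio || 1;  // fall back to 1 if unsupported
      if (ratio >= 2) {
        // Mark the document so CSS can opt into double-resolution artwork.
        document.documentElement.className += " hidpi";
      }
    </script>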

Few doubt that Apple's entire product line is moving to high pixel densities. There also can be little doubt that the rest of the computer, phone, and tablet industry will be forced to follow.

Marco Arment, creator of Instapaper, sent a tweet last month urging Web designers to go out and buy a Retina MacBook Pro "so you can see how bad your site looks on it and fix it." After a fair number of those designers resisted the idea, Arment laid out his reasoning in a blog post.

Arment notes that you can test a site using an iPad 3, or by simulating one within Apple's Xcode. These methods will show you where a site looks bad -- which images are most in need of a 2x (double-resolution) version in order to work well. What they won't show you, in the absence of day-to-day usage, is "the nuance of what looks good, and what works well," on a high-resolution display (the emphasis is Arment's).
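For CSS background images, at least, a workable pattern already exists. The sketch below uses the vendor-prefixed device-pixel-ratio media query that WebKit browsers support to swap in a double-resolution asset on 2x screens; the file names are hypothetical, and the 2x image must be drawn into the same CSS-pixel box as the 1x version:

    <style>
      .logo {
        background-image: url("logo.png");   /* a 100 x 40 pixel asset */
        background-size: 100px 40px;
      }
      /* On 2x displays, substitute a 200 x 80 asset rendered into the
         same 100 x 40 CSS-pixel box. */
      @media (-webkit-min-device-pixel-ratio: 2) {
        .logo {
          background-image: url("logo@2x.png");
        }
      }
    </style>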

John Herrman, writing in BuzzFeed, argues that we may need to begin carrying our cameras again -- the ones we abandoned when smartphone cameras got "good enough." The problem, as Herrman sees it, is that "good enough" is relative to the display quality to which we have become accustomed. In this writer's experience, photos from an iPhone 3GS or iPhone 4 look terrible on a Retina display.

What is the best way to describe in markup the selection of image resolutions that may be appropriate for different devices? This question is the subject of major contention between a group of Web designers and the (mostly) browser developers who dominate the WHATWG, the working group where HTML5 is being defined.
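Two proposals are on the table as of this writing, sketched below with illustrative file names; neither syntax is final, so treat these as snapshots of the debate rather than settled markup. The WHATWG editors favor a new srcset attribute on img, while the designers' Responsive Images Community Group backs a picture element driven by media queries:

    <!-- WHATWG proposal: a srcset attribute listing alternate resolutions -->
    <img src="photo.jpg" srcset="photo@2x.jpg 2x" alt="A photo">

    <!-- Community Group proposal: a picture element with media queries -->
    <picture>
      <source src="photo-large.jpg" media="(min-width: 45em)">
      <source src="photo-small.jpg">
      <img src="photo-small.jpg" alt="A photo">
    </picture>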

And this narrow controversy is one of the forces pulling apart the efforts of the WHATWG and the World Wide Web Consortium, which have been working jointly on HTML5 for years. But that is the subject of another post for another day.

What are you doing in your development projects to accommodate the coming of ultra-high-resolution displays? Please let us know in the comments.