Devices like the television, once at the center of our media lifestyle, will soon be relegated to temporary output displays for the phone. We can already see the first signs of this in novel applications of Apple’s AirPlay technology. Playing Real Racing 2 on your phone? Have an Apple TV? With a few taps, the driver’s-eye view is sent wirelessly to the TV, while the phone itself provides the input hardware (an accelerometer-powered steering wheel) and displays useful secondary content (like the racetrack route). Here’s the surprising twist: in this scenario, both screens are computed and rendered by the phone. The Apple TV is simply passing through a stream of pixels for presentation on the TV’s larger display. This scenario illustrates how the iPhone and the iPad allow developers to design and render completely different user interfaces when connected to a large display.
The Nomadic Computing 2.0 model leverages the user’s superphone as a central computing core that powers a variety of user interface hardware. In other scenarios, the user’s superphone plugs into purpose-built forms appropriate to the task at hand. Recent Android-based products from Motorola (the Lapdock) and Asus (the Padfone) enable the phone to assume the form factor of a laptop or a tablet, respectively. Again, in the case of these products, the companion phone doesn’t merely dock with an existing laptop or tablet; it becomes each of these forms, with the help of a lightweight, modular screen, battery, and keyboard.