Back during TED 2006, New York University research scientist Jeff Han blew his audience away with a demonstration of multitouch, “interface-free” computing. Although he claims the technology is far from new, it is unlike anything I have seen before. Take a look at the video:
Vodpod videos no longer available.
If that didn’t floor you, watch it again. Try interfacing that way with your tablet computer, your DS, or your smartphone. Honestly, put the stylus away and try to use multiple fingers on the display simultaneously. What happens? Either nothing occurs, or only one input is registered. This is one reason touch screens have yet to catch on as mainstream computer interfaces: they offer no significant improvement over the mouse and keyboard while retaining several drawbacks.
Contrast those interfaces with this. Jeff Han demonstrates ten simultaneous fingers and suggests that even more could be detected. Look how smoothly everything works despite Mr. Han’s apologies for how rough this technology still is. Fast-forward to January 2007 and the announcement of the iPhone. Jeff Han says he is not surprised that Apple is the first to bring something like this to market, but I seriously doubt the iPhone will be the only product in which Apple bundles this technology.
In fact, the trackpad on the MacBook Pro I am using is sensitive to multiple touches. If I tap on the trackpad while another finger is resting on the pad, the computer registers this as a ctrl-click. If I slide two fingers across the pad simultaneously, the computer treats that input like a scroll wheel, both vertically and horizontally. If I do this while holding down ctrl, the screen can zoom in or out, and I imagine that upcoming MacBooks may implement the iPhone’s “pinching” gesture for this same functionality (at least in apps like iPhoto).
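The logic behind these gestures is simple enough to sketch. A rough illustration in Python follows; all names here are hypothetical, and a real driver would of course work with raw sensor data, timing, and thresholds tuned to the hardware. The idea is just this: with two fingers down, a change in the distance *between* the fingers signals a pinch, while fingers moving together in the same direction signal a scroll.

```python
import math

def classify_gesture(prev_points, curr_points, pinch_threshold=5.0):
    """Classify touch input from two successive samples of finger positions.

    prev_points / curr_points: lists of (x, y) tuples, one per finger.
    Returns a (gesture_name, detail) pair. Purely illustrative.
    """
    if len(curr_points) == 1:
        # A single finger: treat as an ordinary tap/click.
        return ("tap", None)

    if len(prev_points) == 2 and len(curr_points) == 2:
        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])

        # Change in finger separation distinguishes pinch from scroll.
        spread = dist(*curr_points) - dist(*prev_points)
        if abs(spread) > pinch_threshold:
            return ("pinch", "out" if spread > 0 else "in")

        # Fingers moving roughly together: average their motion for a scroll delta.
        deltas = [(cx - px, cy - py)
                  for (px, py), (cx, cy) in zip(prev_points, curr_points)]
        avg = ((deltas[0][0] + deltas[1][0]) / 2,
               (deltas[0][1] + deltas[1][1]) / 2)
        return ("scroll", avg)

    return ("unknown", None)
```

Two fingers sliding right by the same amount would come back as a scroll; two fingers moving apart would come back as a pinch out, which is exactly the zoom gesture demoed on the iPhone.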
Of course, iPods with a display and interface similar to the iPhone’s are a very safe bet. However, let’s take this further.
- Resolution independence is a rumored feature of Leopard, due out sometime between now and June. This would allow screen zooming to be handled much more smoothly than is possible right now.
- Apple’s interfaces are primarily designed to be accessible without resorting to secondary clicks or hidden menus. There are exceptions to this, but Macintosh user interfaces would need little tweaking to be “hands-on” ready.
- iPhone is built on OS X, and it shares some core technologies with Leopard. Therefore, core iPhone technology could be efficiently ported to Mac OS X computers.
Yes, this evidence is (very) shaky, but I think Apple is the right company to get us rethinking interfaces again. Apple controls the software and hardware of their platform, and this will make such a shift less difficult than if Dell and Microsoft (for example) were trying to implement a similar approach. It just makes sense for Apple to be the company that starts pushing this kind of technology.
Keyboards and especially mice are not well-understood by the masses. I’m always helping adults mouse around their screens and click the correct buttons. They were a good solution when they came out, but over twenty years have passed with no significant progress. Just as Nintendo has pushed the envelope as far as gaming interfaces go, Apple is a natural choice to elevate computer interaction to new levels. I’m ready for the next big thing. I just hope Apple begins pushing this technology sooner rather than later.