The iPad isn't a third device, but a third revolution

A few years ago I brought my MacBook into an Apple Store to get it serviced–this was a first-generation model, which had the rather unenviable habit of spontaneously rebooting for no apparent reason.

The two Geniuses had looked over my computer with the same critical eye that an enthusiast might give a hot rod. “Look,” said one, “he's replaced the battery monitor in the menu bar. And he's got the Dock down in the bottom right of the screen.”

Like hot rodders, techies wear their tweaks and optimizations as badges of honor. To me, that's the chief distinction between power users and your average user: power users adapt computers to the way they work, instead of adapting the way they work to computers.

But something strange happened last week when I sat down at my MacBook after watching Steve Jobs unveil the iPad. I looked at all those little inscrutable icons in my menu bar and saw them for what they were: hacks and shortcuts to “fix” the way the computer worked. Surely there must be a better way.

I was but a wee lad of four years old when Apple introduced the Macintosh in 1984 and first brought a graphical user interface to the masses. “Look,” Apple said, “computers are powerful, useful tools, but they're clumsy and inelegant. Let us show you a better way.” There was no shortage of resistance, especially from those who had gotten comfortable with typing their instructions at the blinking cursor.

Of course, the Mac was derided as a toy and not a tool for serious work, its mouse-driven approach deemed silly. While the Mac's market share remained small in the following years, the impact of its revolutionary interface was felt throughout the world–because every subsequent personal computer operating system followed the Mac's example.

And now, 26 years later, we're still interacting with our computers in fundamentally the same way: a cursor-driven interface in which we point, click, drag, arrange windows, use drop-down menus, and so on. Sure, the trappings have changed, but compare your Mac running Snow Leopard today with an original Macintosh running the first version of the Mac OS and the similarities largely outweigh the differences.

But as good as the Mac is, Apple realized that it wasn't good enough. Take the mouse, for example. There's a reason that Apple has insisted upon a single-button mouse for the last quarter century, even as its competitors have added extra buttons, scroll wheels, variable tracking, and more: Have you ever watched a complete novice try to learn to use a mouse? Before you even get to clicking–or right-clicking or scrolling–you have to learn how your movements translate into the movements of an arrow that flies around the screen. It makes a sort of sense, but I'd argue that much of that sense comes only because we're now used to it.

While PC makers tried to push computing forward by adding extra buttons and controls to give users more options for telling a computer what to do, Apple went in entirely the other direction, asking itself: how do we remove a layer of abstraction between the user and the computer?

That question eventually yielded the iPhone and the culmination of Steve Jobs's war on buttons. And it couldn't have come at a better time for Apple. As others have suggested, I suspect that the iPad was the device Apple had long wanted to release: a touchscreen replacement for the computer interface to which we've all become accustomed. But launching directly into such a product, even given the resurgence of the Mac and popularity of the iPod, would have been an uphill slog.

The mobile phone market provided a perfect opportunity to test the waters. In 2007, when Apple announced the iPhone, cell phones had long been ubiquitous, but smartphones were still just catching on; most were still too complex for the average user. The device itself presented a smaller, more compact canvas on which Apple could put its vision to the test. “Look,” Apple said, “smartphones are powerful, useful tools, but they're clumsy and inelegant. Let us show you a better way.”

Seventy-five million devices later, it's clear that this idea has resonated with users. And, like the Mac, the iPhone has encouraged other device makers to follow suit and introduce touchscreen smartphones of their own. But for Apple, the mobile phone market was never the ultimate goal: the iPhone and iPod touch were a virus of an idea, infecting all those users with a new way of doing things. The touchscreen interface was part of that idea, but it wasn't the whole idea any more than the whole idea of the original iMac was that it was blue. All those competitors just slapping touchscreens on their phones were digging in the wrong place.

Even when introduced into a market that's been primed to accept such an idea, the iPad is a bold, ambitious product. The smartphone, as a category, was still fairly young when the iPhone was introduced; the vast majority of users didn't yet have habits to change. But the way people interact with their computers has remained largely static for 25 years–it has a lot of inertia, and it's harder to move something with a lot of inertia.

The improvements to personal computing over the last quarter century have been, to use an oft-quoted expression, more evolutionary than revolutionary. Changes have been gradual: the ability to run multiple programs, for example, or full color. But with every additional level of complexity comes an additional way of simplifying that complexity. Mac OS X's Exposé is a great example: it's a fantastically helpful feature, but it's indicative of what is wrong with the computer experience. It's a shortcut, a hack to deal with something that's inherently inelegant: the fact that we all have a huge mass of stacking, overlapping windows as a result of a three-dimensional interface shoehorned into a two-dimensional screen.

But Apple's been experimenting for some time on the simplifications that the iPhone embodies. Take the introduction of iTunes (née SoundJam MP). Before iTunes, playing music on your computer involved programs more like Apple's QuickTime Player: you interacted with a music file in the same way that you interacted with a text document in your word processor. You went to the File menu, chose “Open File,” and then navigated through your folders until you found the music file you wanted to play. If you were willing to spend the time, you could organize your music into folders to make it easier to find. Later on came the ability to create playlist files, if you wanted to listen to a particular set of music.

iTunes abstracted that process: you no longer dealt with files, you dealt with music. The program handled organizing files on the disk; you worried about organizing your music the way you wanted to listen to it. You could still use iTunes to pull out a file if you wanted to give a copy to someone else (where legal, of course), but most of the time, you just want to listen to music, and iTunes simplified that. Apple then repeated the approach with iPhoto (although interestingly, it's never quite been able to decide if videos should live in iPhoto or iTunes).

The key here, as with the iPhone, is to abstract the nitty-gritty details of the underpinnings and remove obstructions in the way you do things. Much of the negative response to the iPad seems filled with anger–which, as Yoda adroitly pointed out, stems from fear–and it mostly comes from the kind of power users who like dealing with the underpinnings.

But I don't think the iPad heralds the death of the personal computer or, as many people seem somewhat strangely concerned about, the end of tinkering. It's not as though the iPad is going to murder curiosity. Some complain that Apple keeps locking out the jailbreakers with every revision of the iPhone OS, but the key point there is that the jailbreakers keep finding a way in. Cars are harder to tinker with today, but that hasn't stopped people from becoming mechanics. It's just that the vast majority of people don't care how it works under the hood, as long as it gets them from point A to point B.

For Apple, it's not about killing off tinkerers, but ensuring that not everybody who wants to use a computer has to be a tinkerer.

Few people mourned the damage the personal computer dealt to the typewriter, and most of those who did were either a) fueled by nostalgia or b) people who made typewriters. Few people mourned the damage that e-mail and the Internet dealt to the fax machine–in fact, we're mostly just pretty ticked off that the fax machine is still persistently clinging to life at all. In both cases, people embraced the new technology because it was, well, better.

This is the next phase of computing. Apple's not the only one to realize it, either. The approach of Google's Chrome OS is pretty different from what Apple is doing with the iPad, but it's not hard to see that it's aiming at the same target: making computing easier for the average user. I wager that we'll see a touchscreen tablet running Chrome OS within a year of the software's release, though I am skeptical of how effective that combination will be.

The iPad won't kill the computer any more than the graphical user interface did away with the command line (it's still there, remember?), but it is Apple saying once again that there's a better way. Regardless of how many people buy an iPad, it's not hard to look forward a few years and imagine a world where more and more people are interacting with technology in this new way. Remember: even if it often seems to do just the opposite, the ultimate goal of technology has always been to make life easier.
