Thoughts on the simplicity of computing

Apple’s announcement of the iPad at the end of January has fueled a discussion among designers and, more generally, people working with or around technology, concerning what might be called the simplicity of computing. While it’s not a new discussion, it seems to have grown more and more urgent.

(By the way, I’m well aware that what might seem urgent to me, or to other people who work in the tech field, will be of very little interest to others. I’m not blinded by the illusion that technology topics should be of primary importance to everyone.)

I didn’t think I would write about this, because writers with wider readerships have done so already (see Fraser Speirs and, with a slightly different angle, Rob Foster), until I saw the question TUAW asked its readers yesterday, “What do you want to see in Mac OS X 10.7?” At first I dismissed it as the usual I-have-nothing-better-to-write kind of post (I was probably right on that), but then I realized I actually had my own answer.

One idea that emerges from Speirs’s and Foster’s posts is that people spend way too much time perfecting the way their computers work, or making them work in the first place, or solving problems that, in most cases and to most users, appear completely random. The fact that most computers are operated via a graphical user interface does not make any of this less complicated.

People don’t read

I will go out on a limb here and say that the majority of users believe computers are hostile and cannot communicate. It’s a wrong belief, of course. Computers may not be friendly, but they’re not hostile either: they exist to solve problems and to help people get things done. Also, if anything, computers suffer from an excess of communication—and I think that’s what scares people. The average user gets lost in the excess of verbal and visual stimulation, which is, even in the best of cases, never really explained or understood.

People don’t read not because they’re lazy or stupid, but because they can’t possibly spend all their time trying to understand the machine. And yes, one would think that after more than twenty-five years of desktop metaphors, windows and menu bars some basic concepts would have started sticking—but it’s not always so. I have personally experienced frustration when talking to people who have been using computers (Macs, even) every day for the past ten or fifteen years, realizing that the word “menu” or “icon” meant squat to them.

When I give advice or support on the phone, one of my usual questions is “What do you see?” The most frequent answer I get is “Nothing.” To me, that would mean one of two things: my interlocutor has suddenly gone blind, or the computer screen has gone black. In reality, it means that the visual overstimulation leads to a loss of understanding of a few basic elements of interface design.

Topology (the way things are placed on the screen) is among the first to go: the user fails to understand the boundaries between things, so there are no more windows; the three-dimensional metaphor is also lost, so the user can’t see the idea of buttons being pressed or windows being on top of each other. The loss of control is so overwhelming that the user stops seeing words on the screen as meaningful entities, and sees them only as part of the machine: things the computer “puts there” but that are not really there for the user to understand or interact with.

The designers’ failure

The more designers understand the machine, the less they understand other people’s understanding of it. (I know that by using the word designers I might generate another misunderstanding: I don’t mean “people who draw,” I mean people who make things work the way they do.)

Let’s face it: when the Macintosh came out in 1984, very few people used computers, and back then using a computer meant interacting with it through a command line. The desktop metaphor was easy—and way more essential than it is today—but it was made easier by the fact that the average computer user was already familiar with what was underneath it: the file system.

Today, things are different. Even the most organized people I know—organized in life—sometimes can’t be bothered to understand how to properly use their computers’ file systems. Not even now, when a brand-new Mac comes with ready-made Documents, Movies, and Pictures folders. Still, I don’t think that makes them lazy or stupid. It makes them people. The desktop metaphor has stopped being a metaphor, simply because people don’t see it as such. Desktop has become just “a cute name someone gave to some screen on my computer, but I really don’t know why, and I don’t really care.”

But some designers do not fail

Enter Apple. Really, it’s not because I’m completely smitten with its products that I always use Apple as a benchmark. It’s because, in most cases, whenever a new product comes out, I feel Apple’s designers just get it. Better yet, they get me. And with the iPhone and, now, the iPad, they don’t just get me; they also get all those who are completely unlike me. (And again, designers are all those who contribute to making a product the way it is—whether it’s Steve Jobs himself and his ideas, or the engineers who make it happen, or the architects who make it look so damn cool.)

Apple understands that whatever metaphor worked for the users of personal computers back in 1984 might not work anymore. Or it might work for a subset of current users—a small one, at that—but what about the others? What about those who don’t need things to be complicated, who don’t need to decide where to put a certain file, who don’t need to spend one hour trying to find the file they so carefully misplaced—what about them?

I won’t go into the details of why the iPad will work very well (maybe not yet perfectly) in this sense, because, as I said, the thought has already been expressed by other writers. Let me just say that the iPad aims to answer the question of “everyone else.”

What do I want from Mac OS?

Maybe it won’t be done in 10.7—maybe it’s something that will require a whole new version number. What I really want is a Mac OS that has two faces: one for me and one for everyone else. I mean, simply put, the ability to switch from a power-user interface to an iPad-like interface according to need.

Sure, the Mac already has the ability to restrict a user’s power to do things, by limiting the applications he can use or the interface elements he can tweak. But that doesn’t tone down the complexity of the interface, and it certainly doesn’t make the user feel more in control—quite the opposite, in fact.

I can only imagine what such a paradigm shift would entail for all the software that’s currently available. I can imagine the resistance from certain software vendors—those that can’t do without their interminable menus full of words even the savviest of techies can barely understand, or their palettes, or their multi-layered toolbars.

I’m not advocating for a general dumbing down of user interfaces. I’m not saying computers should just be simpler. If anything, they will only get more complex. But complex doesn’t have to mean complicated. The average user will be able to better appreciate the depth and richness of a computer’s design if he can experience it without feeling overwhelmed by it.

Mac users are always bragging about how easy their computers are to use—but are they really? To the inexperienced user they’re just as complicated (note the word I’m using here) as Ubuntu or Windows. The fact that experienced users are more willing to spend time helping friends and family set up their new Macs than to solve their Windows problems doesn’t mean anything. Just imagine cutting down that set-up time from five hours to thirty minutes. Imagine a computer that is really understandable and usable (that is, immediately productive) within minutes after its first boot, even by someone who’s never heard the Mac chime before. Imagining it isn’t that hard now, since we know it can be done: the iPhone/iPad interface is just the appetizer.