The ‘OK’ Computer
The Apple Lisa was expensive, slow, and short-lived. But 40 years later, computers are still building on its legacy.

How many times in your life have you clicked “OK”? I personally have lost count. From my first clicks on a Macintosh in the 1990s, it’s been the ubiquitous language of assent for computing — an agreement to countless decisions, large and small.

At the birth of Apple’s desktop computing revolution, this was not the plan. The year was 1982, and a small team was testing a design dubbed the Lisa: Apple’s first stab at a machine built around images and buttons, not the text of a command line. A simple “OK” seemed inappropriately casual for a machine that, unlike Apple’s earliest homebrew machines, was catering to office workers. If you wanted to execute a command, you hit a button labeled “DO IT” — simple, straightforward, professional. Or so the team thought until they started putting people in a room with it.

“It wasn’t clear what they were having trouble with,” wrote Apple engineer Andy Hertzfeld later. But when a few users encountered the dialog box, they would freeze up: hitting the “cancel” button, backing out, and, in one case, getting visibly angry with the machine. The problem, the team realized, was the button’s font and spacing: squeezed together, the two-word phrase read as “dolt.” Users were being asked to make a decision — often on one of the first computers they’d ever used — and seeing the machine insult them. OK, the Lisa team decided, might be a little bit informal… but at least it didn’t accidentally call their customers idiots.

The Apple Lisa, which celebrated its 40th birthday this month, is remembered as a glorious failure. Launched in 1983 for nearly $10,000 (about $30,000 today), it was available for less than four years, making it a quickly discontinued stepping stone between Apple’s early homebrew computers and its bestselling Macintosh. At the same time, it was a trailblazing attempt at one of the first graphical user interfaces — a machine that set the model for the computers we use today.

But the Lisa was also something more. Built on foundations laid by early computing pioneers, it represented one of the first attempts at a commercial computer built for humans, expressed in the form of changes like the “OK” button. The Lisa was one of the earliest machines designed to be instantly understandable, thanks not only to the intuitions of its inventors but also their careful observation of newcomers to computing. Along the way, it helped create not only the specific conventions of the desktop but a style of design that we now take for granted, even as it sits on the cusp of a fundamental change.

To understand how, we have to jump back a few more years to a pair of secret meetings in a Xerox research lab.

The Xerox Palo Alto Research Center holds a legendary role in computing history. In its 1970s and 1980s heyday, PARC produced a series of groundbreaking inventions: the Ethernet protocol; the laser printer; and one of the first computers with a full-fledged graphical operating system, the Xerox Alto. It’s the Alto that helped set the Lisa team down its final path and gave it some key members, including lead designer Larry Tesler.

The Alto grew out of ideas birthed at the Stanford Research Institute’s Augmentation Research Center, whose head, Douglas Engelbart, is widely credited with inventing the mouse and many other elements of modern computing. The first mouse was a simple one-button box, its technology patented under the name of an “x-y position indicator for a display system.” But it turned a computer’s screen from something like a high-tech sheet of paper into a full-fledged space with its own geography, setting the stage for the changes to come.

Tesler and some other PARC team members were former disciples of Engelbart, and they packaged these elements in a remarkably small design that could be produced at scale. Much of the Alto’s software — email, word processing — looks familiar in a way that the command lines of even significantly newer machines don’t.

But as recounted in books like Michael Hiltzik’s Dealers of Lightning, Xerox executives ignored or outright feared many of PARC’s inventions, worried they would undermine its juggernaut photocopier business. (A major exception was the laser printer, which would pay off its investment many times over.) Then, as PARC struggled to get resources for the Alto, the co-founder of a then-small startup called Apple finagled a pair of software demos — and ended up seeing the Alto’s full capabilities, which outstripped Apple’s own early attempts at a visual interface.

Meanwhile, some of the Alto’s key players started questioning their loyalties. “I was getting better questions from the Apple management than I ever got from the Xerox management. It was clear that they actually understood computers,” Tesler recalled in an oral history with the Computer History Museum. “Xerox was basically still a copier company.” Soon after, Tesler and a few other PARC employees quit to join Apple, while Xerox translated the Alto into its own office computer, the Xerox Star.

Apple took some concrete elements from the Alto, like a heavier emphasis on the mouse. But through Tesler especially, it also committed to a couple of broader ideas. The first was the importance of what was dubbed modeless computing. Many early graphical interfaces were built with powerful layered sets of commands, which users could switch between by activating different “modes.” Modes were a huge element of Engelbart’s vision — a way to achieve vastly augmented intelligence.

The tradeoff was that users needed a base of arcane knowledge to really master modes, and the consequences of failure could be punishing. In the Alto’s mode-based word processor Bravo, a mere four keystrokes in command mode would select the entire document (e), delete it (d), switch into insert mode (i), and type a lone letter (t) in its place, irrevocably overwriting the file. That was a powerful shortcut — but it also meant you could destroy an entire project by forgetting you were still in command mode and typing the word “edit.”

Tesler was a devoted opponent of modes, and he redoubled his commitment to that philosophy at Apple. “Why have people spend six months to become a user?” Tesler asked. “Why don’t we spend six months or six years even, if that’s what it takes, to make it really easy so people can learn it in six hours?”

The second idea was relying on tests to figure out how people were actually using computers. At the birth of the Lisa, “the phrase ‘human interface’ wasn’t in the terminology,” recalls Annette Wagner, who designed the Lisa’s icons before becoming one of Apple’s early Computer Human Interface team members. “There were no user interface designers.” Under Tesler, however, Apple began setting up formal tests of its designs. It would put new users in front of the Lisa and ask them to talk through what they were doing. The vision that emerged was the computer as a place — and, more specifically, an office.

The surface of a secretary’s desk isn’t the only — or necessarily the best — possible metaphor for computers. Engelbart’s famous 1968 demo introduced many of the core ideas of visual interfaces without it. The Alto itself was built on a concept called the Dynabook, whose creator, Alan Kay, imagined it as an educational computer designed for children who might have never seen the inside of an office. During the Lisa’s development, interface designer Bill Atkinson took inspiration from the MIT Spatial Data Management System, a personalized computing environment known as “Dataland” with a map that users could fly over using a joystick. In the ’80s, Amiga released an operating system built on the metaphor of a utility workbench.

But by then, the major computing players were pitching their wares to an audience of administrative assistants and other office workers. “Engelbart’s idea was that the computer was a tool for augmenting the human mind, allowing us to solve the big problems in the world, in society,” says Hansen Hsu, historian at the Computer History Museum. It introduced the idea that knowledge workers could vastly amplify their capabilities with a better interface. At Xerox and then Apple, that idea was translated into creating the desktop of the future.

The benefits weren’t just practical — they were cultural. At computing havens like MIT, typing was an accepted part of coding. But in the business world, it was associated with secretarial — or women’s — work, not something executives should bother with. When PARC arranged demos for Xerox executives, its researchers used the Alto’s graphics to build a visual application called “SimKit” that let them simulate running a business without ever touching the keyboard. “It was all mouse-pointing and mouse-clicking,” recalled PARC researcher Adele Goldberg in Dealers of Lightning. “We knew these guys wouldn’t type. In those days, that wasn’t macho.”

Even without the Lisa or the Xerox Star, the idea could have ended up seeming obvious. As the Lisa team worked to nail down its design, they stumbled across a 1980 IBM research concept called Pictureworld, which imagined a then-nonexistent powerful computer that hewed as close to a desktop as possible: you wouldn’t just hit send on an email — you’d put it inside a virtual envelope and drop it in an outbox. But the IBM report portrayed Pictureworld as purely theoretical, and publicly, IBM pitched computers as indispensable, touting their behind-the-scenes value for banking or flight booking. “If living with computers makes you nervous, consider another unnerving possibility. Living without them,” warns one early ’80s ad above some clipart of a man hiding from a bank of mainframes.

And without testing, Apple’s vision of a “desktop” might have looked almost nothing like the one users expect today. The original Lisa design, for instance, didn’t use the now-ubiquitous system of files and folders. It considered the idea and discarded it as inefficient, settling instead on a text-based filer that asked increasingly specific questions about how and where to create, save, move, or delete a file.

The filer was considered the best system on paper, but as the team watched people use it, they realized it wasn’t any fun. The constant prompting, wrote designers Roderick Perkins, Dan Smith Keller, and Frank Ludolph in a 1997 retrospective, “made users feel that they were playing a game of Twenty Questions.” They raised their concerns with Atkinson, and the group workshopped an alternative that drew from Dataland and Pictureworld, then brought it to Lisa engineering manager Wayne Rosing.

But there was a problem: Twenty Questions had already been locked into the Lisa, and the deadline to ship was looming. Rosing didn’t want other teams to start adding new systems, and according to Hertzfeld, he also had a bigger fear: if Apple co-founder Steve Jobs learned about the idea before it actually functioned, he might delay the entire schedule to work it out.

The result was a subterfuge that wouldn’t sound out of place in Halt and Catch Fire. Atkinson and the interface team spent two weeks building a prototype in secret, hastily quitting the program whenever they heard Jobs approaching. Jobs realized they were hiding something, made them show it off, and promptly fell in love with it — but, fortunately for Rosing, only after they’d hammered out most of the kinks.

Icons and folders didn’t, the team learned, make creating or moving files around more efficient. But users universally preferred them to playing Twenty Questions. They invited people to explore the interface with the kind of familiarity they might grant a physical space. “The screen became, in some sense, real,” the Lisa’s creators wrote later. “The interface began to disappear.”

To look at the Lisa now is to see a system still figuring out the limits of its metaphor. One of its unique quirks, for instance, is a disregard for the logic of applications. You don’t open an app to start writing or composing a spreadsheet; you look at a set of pads with different types of documents and tear off a sheet of paper.

But the office metaphor had more concrete technical limits, too. One of the Lisa’s core principles was that it should let users multitask the way an assistant might, allowing for constant distractions as people moved between windows. It was a sophisticated idea that’s taken for granted on modern machines, but at the time, it pushed Apple’s engineering limits — and pushed the Lisa’s price dramatically upward. And as Apple was wrapping up the Lisa, it was already working on another machine: the cheaper, simpler Macintosh.

“The problem that both Xerox and Apple ran into with a $10,000 machine is that the users end up being secretaries, and no company is going to want to buy a $10,000 machine for a secretary,” says Hsu. “It really needed the Macintosh to bring that cost down to a quarter of that.”

And after all that, says Hsu, the real breakthrough for graphical interfaces wasn’t that it made the virtual world more familiar — it was that you could more easily push things into the physical one. “It wasn’t really until desktop publishing became available, with PageMaker and PostScript and the laser printer, that [you got] a compelling use case for a graphical user interface-based computer — something that you could not do with a command-line-based computer.”

Non-graphical interfaces never completely went away. At Apple, modes were resurrected in the form of keyboard shortcuts, a system that’s hugely powerful but mysterious enough that even the most experienced users will periodically find themselves surprised. And engineers still dip into the command line regularly, 40 years after the Lisa’s launch. But for most people, a graphical system is all they’ve ever known.

The metaphor of the desktop has slowly given way in recent years. The skeuomorphic approach that made Apple’s early computers so approachable became widely derided on the iPhone, where it took the form of faux-pine paneling and yellow legal pads — until Apple gave it up for a “flat” look in 2013.

But the logic of user testing has turned into a standard part of computing, including at Apple. “The whole idea of what you see is what you get, having an icon-driven user interface, paying attention to whether somebody could use something or not — all of that, I think, came out of the Lisa, whether the Macintosh team wanted to admit it or not,” Wagner says.

And the future of computing may be even more user-dependent. In recent years, Meta has pushed the development of a nerve-reading wristband controller that learns and adapts to users rather than the other way around. Natural language systems — like Apple’s own Siri as well as newer tools like ChatGPT — are supposed to be so intuitive they’re like speaking with a person… even if we’re likely adapting to them as much as they are to us.

As for the Lisa, “certainly the ideas were in the air” for graphical computing in the 1980s, Hsu says. But “the extent that those ideas would have become as dominant as they are, I think, is a different question. We could live in a world where maybe half the world still uses command line interfaces, and half the world uses graphical user interfaces. Who knows?”

Correction, January 31st 10:30AM ET: Apple’s iOS 7 redesign was in 2013, not 2017 as originally stated. We regret the error.