

How Silicon Valley’s past predicts its future: an interview with Ellen Ullman

Ellen Ullman. Photo by Marion Ettlinger.

I first came to the writer Ellen Ullman through her second novel, By Blood, which I read in gulps. When I went to see what else she had published, I discovered her other job: programming. Her memoir Close to the Machine, which takes place during the late-90s internet boom, captured the emotional draw of creating code. And her first novel, The Bug, was inspired by her real-life experience with a particularly persistent bug — only the novel got much darker.

Her new book, Life in Code: A Personal History of Technology, is a series of essays beginning in 1994 and ending in 2017, coinciding almost perfectly with the rise of the internet in most people’s lives. (You can read an excerpt here.) The first essay in the new book is the one that inspired Close to the Machine; it was written with “a beginner’s mind,” Ullman told me. “I never could write like that again. You can only get a beginner’s mind once.”

Part of the pull of Ullman’s work is that she’s aware of the humans who make technology, and the effect tech can have on ordinary users. But part of it is the undeniable style with which she writes; when she describes algorithms as beautiful, you believe her because her handling of the English language suggests a mind alive to beauty. Parts of the book, particularly around the Y2K bug scare in 1999, feel as though they wouldn’t be out of place in a history curriculum, but others — particularly a prescient discussion of artificial intelligence — still resonate.

In fact, reading the early parts of the book, it’s easy to see how Silicon Valley took its current form. One piece, from 1994, contains a discussion of eugenics among programmers; when Ullman interrupts to object, an engineer “makes a reply that sounds like a retch. ‘This is how I know you’re not a real techie,’ he says.” It is an arresting line at any time, but with the latter-day rise of online white supremacists, it feels especially prescient.

Other pieces, about how Ullman is treated by her male colleagues, foreshadow Google’s recent memo scandal — in which James Damore was fired for implying women were biologically unfit to code. It is hard to read Ullman without thinking of later Valley scandals, in fact: Ellen Pao’s sexual harassment case against the venture capitalist firm Kleiner Perkins, for instance; or Uber’s widespread discrimination against women, which eventually led to the downfall of then-CEO and founder Travis Kalanick. For anyone who is wondering how we got here, Ullman’s book provides essential reading about Silicon Valley programming culture.

But for all that, Ullman remains essentially optimistic about technology and the future, suggesting online courses to get started with programming, and urging people with an interest in the humanities to take up code. The history provided here is essential for anyone who wants to chart a path forward. I spoke to Ullman in late July about her book, Silicon Valley, and the politics of online protest. Invade the nerds, she urges; but be prepared for how much they’ll dislike it. After all, what existing group enjoys disruption?

This interview has been edited and condensed for clarity.

Were you at all surprised to arrive at where we are at this moment, where we have this second boom?

No, not at all. In the book, there’s a piece about the stock market surges, the investment surges, that were driving a crazy boom. There was a big question, to me: was there really value in the stocks? Was it really true that people would order dog food from the internet? It turns out they will, but not then. At that time, the populace was not ready for it; online shopping was still not the norm.

Am I surprised that there was another boom? Absolutely not. Technology and its development are far too powerful and becoming, as I feared from the very beginning, so integrated into essentially the capital areas of our lives. There was no turning back. It was only going to go forward. The question was, how exactly would it unfold as it went forward?

There are companies whose valuations are based on their growth, but not on their growth in profit. I worry about that, specifically when some of these companies go public, as I think we just saw with Snap. If the public had rushed in, they would have lost a lot of money. I fear that the uninformed public has been kept out of the game of investing. These startups are privately held, essentially, getting private funding. The general public sees, wow, all this is happening. All these valuations, how do I get in? Then the company IPOs and the public jumps in. The insiders decide this company is not really that good, a Twitter or a Snap, and the stock goes down.

I fear that the general public who wants to rush in will wind up in the position they were in during the first boom. They wanted to come into the gold rush and they were the last ones in. So far, I don't know. I don't know how many in the general public actually rushed in to buy Snap or Twitter. They lost a lot of money if they did.

Now if venture capitalists lost money or Google lost money, fine. They have a lot of money. They are extremely wealthy investors. They knew the risks, or they should have. They know what they are doing. If they lose money, they lose money. They expect that only one in 10 startups will succeed. I'm not crying over Google losing a little money when they still have so much.

The culture you identify in the book has, I think, come into a very full and ugly blossom this year. The obvious example, of course, is Uber. The culture of win at any cost, cut whoever down. This hyper-competitive, really quite nasty culture that shows up very early in the book when you're emailing with co-workers. How has that culture developed and changed and in what way is it interacting with the boom now?

Sadly, I don't think the culture has changed. I think it's actually worse in the startup world. Women, somehow, are not given money to manage. You see it on Wall Street and you see it in the startup world. The majority of the money, the majority of VCs, are white men. The majority of the startup managers are white men or Asian men. It's tough for women right now to get in. It's really a gladiator culture. Uber is, unfortunately, the world I lived in. It hasn't gotten better. If anything it's gotten worse, because Uber has tremendous reach.

When I saw that video of the CEO berating that taxi driver, I was astounded that he would even speak like that. It's bad for all minorities. There are very few brown faces in the world of technology. You can watch the people marching down the startup boulevard of 2nd St. in San Francisco and you'll see who is in this culture and who is running it.

And then, because there's so much emphasis and focus on spectacular imagery and online community, people are almost neglecting the un-sexy community meetings where these things get decided. What does that mean for us? What gets lost?

Not almost neglecting, are neglecting. It's too easy to get into a Facebook discussion. You feel you've done your part. I have to say, as someone who’s a writer, waving my hands and trying to write something that alerts people, am I any better than the people on Facebook? I ask myself that, believe me. But we have to move into the physical world: make our local organizations, get together with people who are near us, know our neighbors and so forth. What's happened is the idea of community has become global. That's really great in many ways. But organizing has to be local.

You need to meet the people. You need to know who they are. We need to come to some kind of trust and sense of responsibility in the group and so forth. Come back to the new idea of community, or the old idea of community, or the new/old idea of community: that you have to deal with who is on the ground around you, who you live with, who your neighbors are, your school board, the people running for local office. Who even knows who their representatives are? I don't know whether that will do any good, but there's local government.

I will say something here about Democrats that is really disheartening. The Republicans knew to start on the ground. They understood that they had to go to the local governments, the small organizations, and work their way into the state houses, then the representatives and so on. They wooed entire states into being Republican states. The Democrats have not concentrated on creating leadership at the lower level. That's part of what we need to do. There are some organizations that do travel to these small events and raise money for local representatives who are not getting funding. This is one of the ways to do it. This is not street protests. This is not really neighborhood, community organizing, but it's one way to make change happen.

Right now I have very little faith in government. I'm afraid that feeling, that sense, is really deadly, to lose the idea of a republic. The web in many ways from the beginning has been: you don't need all these experts between you and everybody else. That began happening; I wrote about it in 1998. It was, you don't need all these brokers. They are just out for themselves; you are paying them money. Just go directly to a webpage. It was skipping over journalists, librarians, and all sorts of editors and lawyers and brokers and agents to go directly to a website. That's where you're better served. You can trace that all the way to where we are now, with Trump and Twitter. That arrow began flying in the late '90s and has landed at his feet.

This is not new. In many ways it is, in your words, the ugly blossom of the technology that started long ago. Yes, I don't know how to tell people. We need to get out of the internet now. We have to get out of the webpage, out of Facebook. As I said, there's a global community and then there's a local community. Without forming associations at the local level I don't see how political change can happen.

You can go back to Tocqueville writing about America: what he finds strongest here when he's traveling is local government. The federal government's almost an afterthought to him.

Right. It is sad to remember. It's sad that you bring that up. How do we get back there? There's no back there. There is no going back. There has to be a way of taking this sense of how you talk to people and form consensus, or, even if you can't form a consensus, agree to disagree and so forth. That is a new development that has to be acquired in a new way, locally. You can use the internet to organize locally as well. There's no going back to Tocqueville coming to America. It is an ideal to be hopeful about.

When I started reading your book the thing that really was very arresting, the section that opens with the line, "Real techies don't worry about forced eugenics." We’re in a cultural moment now where we have white supremacists again marching, along with this sort of lackadaisical response from Silicon Valley. How do you shift that culture to create something less toxic? For those of us who aren't programmers but who are, nonetheless, entangled in this world created by programmers, how do we navigate this?

My idea about people learning to program is mainly to demystify it. I spoke with another person who interviewed me, who said that she tried programming and it sounded too hard. She didn't like it and she stopped. I said, well, fine. Not everyone can be a programmer. The idea is to demystify it. You try it, and maybe it's not for you. But you have to try it. You have to have a passion for it. You have to deal with failure. You have to deal with bug after bug after bug. If you're not comfortable with failure, you're not going to succeed as a programmer. The idea is just to demystify it, to understand that people wrote this and other people can change it. I will add that this will be less and less true once we have machine learning and algorithms writing algorithms writing algorithms, until they go into a place where the writers of the initial code don't even know what's going on anymore. It's code that writes code that writes code.

There's a lot of discussion in computer science circles about the dangers of that. This woman I spoke with said she had a different, rounder perspective on technology after she tried to program. I think that's a good thing. That's why I thought about these online courses. My problem with it is you need a computer.

Form a small group, three of you, maybe four, and take one of these classes on your own. That way you can see the kind of culture and filter it out, talking among yourselves. Come out of it knowing whether you love this stuff, and that's a big if, before you go into that first computer science class.

In that regard, come armed with some knowledge and the knowledge that you really want to do this; you've got to have that passion. You're going to face all this prejudice, and you will. It's not going to change, by the way. When you walk in that room you are going to face all this prejudice, over and over, in big ways and small. I'm not saying, tough it out. I'm saying, you're going to face it and have to find it in you to say: you're not going to drive me back. I already know what I'm doing here.

It's a difficult task. All people face prejudice, but especially people of color and women. You have to stare it in the face. It's not going to go away just because everyone agrees it's a bad thing. When you get that first job, it's still going to be in your face. That's what I mean about coming prepared with some knowledge.

Will this change things? I sure hope so. I sure hope that with more people involved, when someone doing a business plan says, we're going to serve certain zip codes, someone else can ask: what about these other zip codes? Maybe the people over here, in these lesser neighborhoods as you see them, can be your customers. Who is the you? Who is the user?

I went to one of these events where angel investors host people who want to do a startup, would-be CEOs of a startup. They make their presentations; sometimes it's one minute, sometimes it's two minutes, sometimes it's eight minutes. I was mostly appalled by what I saw. It's just the same old delivery services and valet services for wealthy people and so on. Afterwards there is all this networking. You must go out there and network. I guess it worked for some people. I think it's a horror, that thing. That's just me.

Actually, a guy came up to me, and he told me about the software he's writing to scan resumes and find a good fit for the company. I said, finding a good fit means you're just going to find people like the ones who are already there, guys you’re comfortable with, so you’re perpetuating the prejudice and the segregation of the digital culture. He listened to me patiently while I went on. Then he said, all that may be true, but I'm working for the company, not society. To me that was all too perfect: another ugly blossom. This was right out there and direct. Who's the you on the other side? I'm working for you.

Look at all those ads about cars. Who's the you that they're selling to? I so wanted to tell him, I always look at ads. They say "you." I go, who? Who do you mean by that? This is what goes on in the startup world. Can that change by people getting inside it? More women making pitches? Yes. More women have to make pitches and more women have to get into giving money. It's going to be a long haul. I fear that a lot of the things women have gotten funded for involve clothing and crafts. That's fine; I buy a lot of clothing. It's not that women aren't pushing into those more technical areas; it's that it's more difficult for them to get money there. They're not seen as capable of that.

It's going to be a long push. I don't see any magic here. This culture is entrenched. It flows into our lives in the most intimate ways. Noticing this, and noticing the ways in which it is changing your behavior, will be the first step.

My memory was jogged again when you were talking about the ways in which automating is entrenching some of our prejudices. Obviously AI learns from us — our prejudices, too. The AIs are just going to recreate it.

Yes.

The bit you wrote about AI was about a decade ago. We're starting to see another, sort of, spate of headlines about the promise of AI and the promise of algorithms. Have your feelings changed in any way since you last wrote?

I want to distinguish here between AI in cars and humanoid robots. What I wrote about humanoid robots has not changed. As a matter of fact, it has more or less become, at this point, the consensus of the computer science world. However, you're still likely to see papers that go back to the old rationalist game, playing chess as the highest level of intelligence on earth and so forth. I do see that persist.

In general, the idea that knowledge is embodied and intelligence is embodied is fairly settled at this point. What's not spoken about enough is that knowledge is social. Our sense of who we are, our self, if you will, comes from social interaction. We are social beings, like other mammals. We are born helpless. If we didn't, very quickly, learn who's safe, who's not, who wants to do us harm, who's a friend, human beings could not survive in the world. You can talk about fearing a saber-toothed tiger or picking berries in the woods, but the primary skill is noticing one person or another and knowing that the other person has an interior life like yours. It's called theory of mind. That idea, that it comes from social interaction, still is not as strong as it might be. Our whole sense of who we are and how we learn comes out of our interactions with other people. Cynthia Breazeal was very smart on this when she said, I don't have the exact quote, but: we build ourselves through interactions with other people. That's exactly right.

The big use of AI right now is in, you know, data science. It's a whole new profession, data scientists, who of course go through massive amounts of data to find out something about us in the future. They write algorithms to search for things. The big problem, and I'm not the only one saying this, people smarter than us say it, is that it just learns from the present. It can't say, there might be new people out there who we haven't seen who might be great. If you're looking for which zip codes are good for you, you only go to the ones that were good for you in the past, the rich ones. The AI in itself can't explore. It doesn't ask the questions unless human beings direct it.
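The limitation Ullman is pointing at, a system that can only rank what it has already seen, can be shown in a tiny sketch. All the names and figures below are invented for illustration; this is not any real company's model:

```python
def recommend_zip_codes(history):
    """Rank zip codes purely by past revenue.
    A zip code with no history can never appear in the ranking."""
    return sorted(history, key=history.get, reverse=True)

# Only wealthy areas were ever tried, so only they have "data."
past_revenue = {"94105": 1_000_000, "94110": 250_000}
ranked = recommend_zip_codes(past_revenue)

print(ranked)              # ['94105', '94110']
print("94124" in ranked)   # False: never served, so never recommended
```

The model cannot propose an unserved neighborhood, because, as Ullman says, it only learns from what already happened; a person has to ask about the missing zip codes.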

I could take you into cars, and that is a really long discussion. I'll say this one thing. News came in today that a Tesla was hacked again, even though it had been patched after someone in China hacked it before. The patch didn't hold. So, first of all, any car that depends upon the internet can be hacked. The other thing is that we call it a self-driving car, but you get closer to what it is if you call it a driverless car. Obviously, what we're taking out is the human being. That doesn't get into the fact that human beings have 100 years or so of relating to cars, and of dealing with other cars. Now, we're error-prone, that's true.

Here's an example. The first crash I heard of for the Google car came at a four-way stop. Now, it followed the rules. Do you even know the rules for how you deal with a four-way stop? Could you pass your driving exam by knowing this rule?

The person who is to the right, if both of you arrive at the four-way stop at the same time, goes first. But most of the time you want to be alert, because frequently drivers will wave you on.

Exactly. So the car used that rule. The cars came at the same time. The car that arrives first goes, and then the others are tied, so it lets the one on the right go. It probably crashed in the intersection, a slow crash. It turns out that human beings signal each other with their eyes. You perceive it in a flash. You don't even think about the fact that you exchanged eye contact with the other driver. Even more, you read the car itself. When you see a certain car barreling toward the intersection going fast, you just say, okay, I'm letting that guy go. Right? You don't have to wait for eye contact. You just know, I'm gonna hang back. This goes for cars in general, even on the controlled path of the freeway. You don't just look at the proximity around you. You look far ahead. In your rearview you can see a car changing lanes wildly. You know when that car gets to your side you're going to give it room, because it's going to cut right in front of you.
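To make the gap concrete, here is a hypothetical sketch (not Google's actual logic) of the four-way-stop rule as a literal procedure. It resolves the tie mechanically; nothing in it represents eye contact or reading another driver's intent:

```python
def who_goes_first(arrivals):
    """arrivals: list of (car_id, arrival_time, position) tuples,
    with position in clockwise order: 0=N, 1=E, 2=S, 3=W."""
    earliest = min(t for _, t, _ in arrivals)
    tied = [a for a in arrivals if a[1] == earliest]
    if len(tied) == 1:
        return tied[0][0]  # unambiguous: first to arrive goes
    # Tie: break deterministically by position, standing in for
    # "yield to the vehicle on your right." The rule always produces
    # an answer, even when human drivers would negotiate it visually.
    return min(tied, key=lambda a: a[2])[0]

# Two cars arrive at exactly the same moment.
print(who_goes_first([("A", 0.0, 1), ("B", 0.0, 3)]))  # A
```

The procedure is complete on paper, yet it encodes none of the social signaling Ullman describes, which is exactly why a car following it to the letter can still end up in a slow crash.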

Let's not talk about inattentive drivers. I'm talking about: can AI incorporate what human beings know from interacting with other human beings? That is the big question. Can a car incorporate the knowledge of human beings, which happens because we are in society, because we look ahead, because we can read other cars, we can read the drivers and passengers in other cars? This is the sense in which we're bodies, and can that be translated into what a driverless car can do? To me that's a big question, in addition to the technical problems of being hacked. And one that's even more problematic and common: these cars from different manufacturers are going to have to agree on standards of reaction and a standard interface. The rest of it will be a black box. It's proprietary information.

Nothing makes systems more error-prone and buggy than a situation like that. These are proprietary black boxes. Everyone agrees on the interface, but each side interprets it slightly differently. It's not well documented, and suddenly these interfaces don't work. They are terribly error-prone. Anyone who works with systems will tell you that. There are a lot of problems ahead. It will come with time. I have no doubt that driverless cars, human beings not driving cars, are in our future. What's working now seems to be in towns, I don't know if it's Arizona, that are building infrastructure around their cities: planned areas with the structure to incorporate these cars.

Now can you imagine that in New York? Can you? People are New Yorkers. Look, the subway is 100-plus years old and it runs 24/7, so it's a miracle, right? Can you imagine people paying enough taxes in New York to fix the subway and create an infrastructure for a self-driving car? I don't see it soon. There are a lot of obstacles right now. That doesn't mean it won't happen. The question is, can those human values and abilities be incorporated into this picture?

That seems to me to really be the core theme of your book: the idea that our social values, the things that are often, sort of, tossed aside as not really being technical, are in fact what this technology is meant to be about. The technology is social, like human knowledge is social. That much seems obvious.

It seems to be something that is missing, in a way: the idea that technology should be informed by society rather than the other way around. There is this kind of give-and-take relationship. I'm curious to know, before I let you go, what do you make of social media?

I was going to say, let's take Facebook, for example.

Facebook is a perfect example because Facebook is so afraid of human judgment that they fired a bunch of editors and now every time something goes wrong they point to an algorithm and they say, the algorithm did it, as though nobody wrote the algorithm.

What's trending wound up with Nazi sites and all this fake news, which, unfortunately, is the kind of thing that gets adopted by people who wouldn't otherwise go there. They had to go back to human editors. Just trusting the algorithm didn't work, because you can only judge what's trending by the number of clicks. If a lot of people are going to this weird information, that's what will pop up first. In these kinds of situations you need a human being evaluating it.
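The mechanism Ullman describes, ranking purely by clicks, fits in a few lines. The stories and counts below are invented; this is a sketch of the failure mode, not Facebook's actual system:

```python
from collections import Counter

# Click counts stand in for "engagement." Nothing here measures
# quality or truth, only popularity.
clicks = Counter({
    "legitimate news story": 1200,
    "lurid fake story": 5000,   # outrage draws the most clicks
    "local report": 300,
})

def trending(click_counts, n=2):
    """Return the n most-clicked items, which is all this metric can do."""
    return [item for item, _ in click_counts.most_common(n)]

print(trending(clicks))  # ['lurid fake story', 'legitimate news story']
```

The fake story tops the list by construction: when clicks are the only signal, whatever attracts clicks wins, which is why human editors had to come back.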

Google, even Google, puts some human beings in there to watch what's going on. It has to, because it has to keep adjusting its algorithms, because everyone changes their tactics as soon as they figure out how to get farther up the page. Google will tweak and tweak. They have to follow what's going on and react as human programmers.

My first feeling when I saw Facebook was, I remember when the big deal was to have your own webpage. It was so wonderful because they weren't standardized. They didn't look like Microsoft or Apple. It was your screen. You did whatever you wanted with it. What's happened on the web, of course, is that the software has become routinized; don't get me started on how they change the software at will. One minute you're an expert user of whatever site you're going to, the next minute you're a beginner and an idiot clicking around.

Facebook, first of all, loses the kinds of interactions you can have. That may be good; not everyone can make a webpage. But it kept changing. I mean, at one point they changed the timeline to a wall, or was it the wall to a timeline? Now it's status. All these people are using status to say, "I went, you know, to see my parents and here are all the pictures." It's not a change of status if you're using it to say whatever I'm doing right now and what I want to talk about. At some point Facebook just completely changed the format. It's not only creating an area where people can talk. I admire the kind of discussions that can be created. I don't admire the kind of nasty discussions that can be created, but this is the world and people can say what they want.

I don't like the idea that Facebook controls how people express themselves and changes it periodically according to whatever algorithms they use to figure out what they should do, or the whim of some programmer or some CEO. That bothers me a great deal. There's no going back from Facebook. Again, it's not like, okay, I'm not going to use Facebook. I basically don't use Facebook except with my closest friends. All those people telling you what they're doing. It's nice that I haven't seen that person in five years and I'm unlikely to go to their event in Boston; it's nice to hear about it from time to time. I can't keep track of hundreds of people. I don't go out there looking for friends. I use it. I don't rely on it for my social life.

When there's someone I really care about I like going through email. I like the occasional phone call. I feel I can actually have a discussion that's shielded from other people. We're not showing off what we're talking about. We're talking with each other. To me that's the crucial difference.

There's no going back from Facebook. A billion people can't be wrong, right? Where it goes from here, that's what I fear. I'm not sure it's driven by people saying, this is what I'd like and organizing to say, this is what I like. What's done comes from above.

Right. It comes from what's going to make Mark Zuckerberg money. That's what the decisions are made from, not from what people like, not from what people want, except insofar as what he sees people liking and wanting aligns with his profit incentives.

I just don't understand sometimes why you would change the format of something that's so successful, I mean really change it. People protested. It was like, what? This is the way I was communicating with all these people. You mean I can't do that anymore? Yet, he went ahead. It's sort of like New Coke: people said, I don't want this Coke, give me back the old one, and they did. They were smart enough to know that when you have a good thing you don't mess with it. Facebook went ahead and did it anyway. That's what I'm saying. It's this constant thing of, you know, it's not a wall. What the hell's a wall? Timeline I kind of understand, but why is it a timeline? The whole organization of it makes no sense. Everybody's telling me they're changing their status, which just means they woke up yesterday and they woke up today and did something.