In Cadwell Turnbull’s story, “Monsters Come Howling in Their Season,” a journalist travels to St. Thomas in the aftermath of a massive hurricane and sees firsthand how the island’s residents are coping with the help of a community-based AI system called Common.
Turnbull is a rising star in the science fiction and fantasy world. He has published stories in Lightspeed Magazine, Nightmare Magazine, Asimov’s Science Fiction, and for Grassroots Economic Organizing. His debut novel, The Lesson, about relations between humanity and alien visitors, is due out later this summer from Blackstone Publishing. (We’re really looking forward to reading it!)
The Verge spoke with Turnbull about climate change, community resiliency, and the implications of a collectively owned AI system.
This interview has been lightly edited for clarity.
Early in your story, you mention 2017’s Hurricane Irma. How do you think we’ll look back on this storm in the future?
We’ve had a lot of big storms over the last few decades, but Irma was special. Irma and Maria spooked storm-hardened Caribbean folk, which is not easy. If Irma becomes the norm, it would be devastating.
St. Thomas, for example, gets hit almost every hurricane season with a storm. This isn’t usually a problem. We’re not great, but we can take a licking. We shrug off tropical storms and minor hurricanes. But if every one is a big one, if every storm lingers longer, many Caribbean islands lack the infrastructure to hold up to that sort of onslaught, and they’ve learned that they can’t depend on the US and other powers to supply adequate aid.
This made the 2017 hurricane season a game-changer, a sign of what’s to come. Everyone is nervous, and they should be. It also makes it a great time to come up with better plans.
You make a connection in the story between a society prioritizing an imminent danger like climate change and a community that engages in cooperative ownership. What role do you think this plays for us in the real world?
Nothing short of a cooperative effort will save us. I’m all for support from local and national government, but that alone won’t preserve the communities most affected by climate change, which are often communities of color. Disaster capitalism thrives on the ruin of these communities, and big pockets have been shown to sway the priorities of government.
But community ownership provides a different ethos. There’s a strong incentive to fortify one’s home against disaster, so it makes sense to extend that culture of ownership to the community one inhabits. If the community owns its homes and businesses collectively, they’ll protect them as a community. It can’t just be a few land and business owners in charge of that decision; they will act for their own benefit. It has to be democratic and decentralized. Each person must have a stake in it and a larger foundation of cooperation to turn to in times of crisis. An individual might not be able to afford to shutter their windows and fasten their roofs, but a community foundation funded by community members and businesses could do that. It can’t work if the communities don’t collectively own the infrastructure. Long-term disaster preparation needs cooperative practice.
The lesson here extends beyond disaster, but I think it is easier to see within that context. If we can agree to the logic that communities affected by disaster are best equipped to plan for future disaster (if they possess community ownership), that logic can and should extend to the everyday. We know what happens to marginalized communities that can’t protect themselves from the whims of extractive capitalism. They get paved over. They get gentrified. The lucky ones fall victim to a slower death as their lives and livelihoods erode over decades.
Disaster preparedness is only one way these communities can benefit; it preserves what is there. The other side of the coin answers the bigger question: how can cooperative practice gradually improve the lives and livelihoods of all community members? That’s a question of what can be, and that possibility is invaluable to the disenfranchised.
You feed this mindset into the story’s open-sourced AI, Common, which helps manage its communities. What was the inspiration here?
Assistants like Alexa and Google Home, to be honest. I love the possibility of them. I wish we had open-sourced options that were accountable to users. This doesn’t erase issues of privacy, but it does place an important check on exploitation, which we continue to see by our internet giants. We should have control of our data. We should own it, along with our attention. I wanted to create an assistant informed by that idea. It is what I’d like to see from our internet platforms as well: public ownership. But I thought common assistants would be an interesting part of that larger issue to play with since a system like Common could do a lot of good beyond commercial applications.
It helped that Common taps into an exciting sci-fi trope: artificial intelligence and the uncanny valley. I don’t know if Common is alive, but I find it interesting how blurry that line can be, how much the appearance of life can subvert our foundational beliefs. Does it matter if a thing is alive if it convincingly mimics life? If a large part of the Earth’s population was working on an AI’s knowledge bank, I’m sure life-like mimicry would continue to reach for that upper limit or surpass it altogether — whatever that would look like — and the answer to the question of sentience would become even more elusive. I imagine there’s a future in this story’s world where people start debating whether they should be allowed to “use” Common without giving it a choice. They’re not there yet, at least not the majority, but I like to think that conversation is on the horizon as Common proves more and more life-like.
How do you imagine that people could prevent bad actors from manipulating the system?
They don’t. Not always.
Community inevitably exerts certain pressures on individuals. Accountability is a very good thing a community can provide. We do it all the time in infinitely small ways.
But I think we need to be careful that community ownership doesn’t lead to encroaching self-surveillance. We’ve seen what happens when authority taps into that sort of nosey neighboring. Let’s be careful and check ourselves while we’re checking others. A healthy culture needs both.
A larger concern of mine is how this framing teaches us suspicion, instead of solidarity. It is scarcity mentality in a world where some people have too much, while others starve. If a community member takes two loaves of bread, I’m not worried about that. They’ll eat it eventually. More troubling is the 20,000 pallets of bottled water left out in the sun while Puerto Ricans had to survive off mountain springs in the wake of Hurricane Maria. That happens when communities aren’t trusted with distributing the things they need, and we leave that power to authorities with their own agendas for how resources should or shouldn’t be used.
The reality is that transparency is less likely in the hands of the few. A cooperative system that emphasizes transparency and accountability will catch most of the boogeymen we reanimate during these debates. But even if it doesn’t, the larger ethical question is more important. Who do we want to be: a society that gives its people what they need, trusting that we will all take care of each other or a society that gives that responsibility away to nontransparent authorities, letting people die in the process?
It’s pretty obvious where I place my vote.
There’s an interesting debate about the role of privacy with such an omnipresent system. Do you think people would allow for this level of surveillance with a system like this?
This is something I don’t think this story adequately explores. It is a big question with a ton to consider. If we use Common as our example, we run into problems because Common is so complex that its data really only goes to, and is used by, Common itself; people can hardly decipher its code. And there are safeguards in place so that the information isn’t shared without permission. This can’t be perfect, however, and I’m sure that Common blabs about things it shouldn’t. The story even plays with this. Common brings up people’s names and professions without it being clear how it collected that information. The question here is: are we comfortable with an AI doing this if it is collectively owned and the services it provides are useful? Terry has her answer to this, but I’m more on the fence. I’m not worried about Skynet or anything. I’m more concerned about how societal shifts could affect how that data is used if Common is ever cracked. Common has collected a lot of data, and if an individual or group found a way to access it, the ramifications would be terrifying.
Safeguards fail. While I might trust Common to follow its parameters (unless we experience the unlikely Skynet / Ultron scenario), I’m much less trusting of what we would do with that information.
That said, sure, we’d allow for that level of surveillance. We do now. Collective ownership mitigates some of my concerns. Not all, but I think I’d sleep a tiny bit better at night.
I love the moment when Common asks if it can remember a conversation with Terry. It’s touching, but it also shows that the system is more than the smart assistants that we have today. Do you think such a personality is vital for a system like this?
Yes, absolutely. Common would need a lot of protocols teaching it to respect people’s privacy, even from itself. Like I said before, this can’t be perfect, but the attempt is important. I really like that moment because it reveals an aspect of Common that goes beyond what a human would do, for better or worse. Humans can’t willfully forget things. They can guard them. They can agree not to share them.
But what Common is doing is opening itself up to the loss of data it values. In my mind, that individual Common wouldn’t want to give up that conversation. Talking with Terry surely helped it deal with the loss of its user. For the access it is provided, Common is willing to give a lot up, making itself incredibly vulnerable to its users. That likely helps people trust it more, give more of themselves to it. That is both scary and beautiful, and that ambiguity was something I wanted to explore.