Off Route 126, past the Ochoco Wayside State Park in Prineville, Oregon, lies a series of monolithic, logo-less gray buildings guarded by security. At first glance, you might think it was a government complex, or maybe an energy company or large-scale manufacturer with a name you do not recognize and would soon forget. Instead, our local cab driver — steering up a steady incline in a beat-up Dodge van — points to the right and says, “That’s all Facebook,” with a matter-of-fact tone.
The sentence seems out of place, given we’ve just been ferried about four miles from a tiny city of about 9,250 people, from the entrance of a Best Western next to a diner called the Apple Peddler. But it’s a reality for the citizens of Prineville, who share their home with Facebook’s first and largest data center, located at the appropriately named 735 Connect Way. The site now totals more than 1 million square feet of both active and in-progress buildings. It’s home to countless Facebook server racks, solar panels, cooling systems, and, of course, data, which is funneled in and out at rates and magnitudes incomprehensible to most non-engineers.
To accurately describe the data center’s purpose — how it functions and what it stores — requires a healthy amount of expertise. Last week, I joined the company’s first open press tour in nearly three years. We were guided by Ken Patchett, Facebook’s director of Western data center operations, who tried to explain the finer details of handling a significant chunk of the social network’s traffic and housing US users’ data.
Facebook is building a third building in Prineville, set to open in December.
"We’re using 20 gallons [of water] a minute for both buildings," he tells us of the cooling operation, which he says is low compared to standard municipal water usage. The Prineville buildings draw in the dry Oregon air through large rectangular slits in the upper halves of the walls, and mix it with water and server exhaust to achieve an ideal temperature of between 60 and 80 degrees Fahrenheit. Patchett talks of voltage and wattage — 277V per cabinet, he says — and transformers and megawatts, educating us about air filter quality and the reduction of static electricity in the air.
If it sounds like a college course on electrical engineering, you’re not far off. The data center is a mystical place to the average Facebook user. Yet much of what happens onsite revolves around efficiency: power consumption is kept as low as possible; air is cooled to the ideal temperature; exhaust is dealt with appropriately; servers are fixed in a timely manner; every single cable and server and rack has a barcode that is scanned. There is no brain or heart or giant blue-hued and humming artificial intelligence calling the shots. Around 165 people manage the entire operation daily.
"This is not a cloud. We put petabytes and petabytes of data through here."
"This is not a cloud," Patchett tells the group as he stands among row after row of server racks in an underground level of PRN2. The building is the second and larger — 450,000 square feet, to be precise — of the site’s two operational data centers. A third is on the way, opening in December. "We put petabytes and petabytes of data through here," Patchett says. A petabyte is one million gigabytes of data. It's a lot of Facebook photos, he adds.
Patchett calls Prineville the heartbeat of Facebook, but it’s more accurate to say it is one of many heartbeats. Facebook relies on numerous data centers around the US. It does not disclose the exact number, but it’s currently building one in Texas and has another operational in Forest City, North Carolina. Around the globe, the company relies on third parties, since building its own data centers on multiple continents is not cost-effective.
Here in the US, Prineville is the company’s crown jewel, a place that is as functional as it is symbolic for the company’s philosophy on sharing. With Prineville’s first building under construction back in 2011, Facebook announced the Open Compute Project, an open-source community designed to facilitate the exchange of data center hardware and infrastructure design.
"We’re not in the business of having secret things," says Kevin Lee, a technical program manager at Facebook. Lee works on server design and oversees Big Sur, an AI training system developed in collaboration with Nvidia. "Our goal is to understand the world, to push AI." In doing so, the company wants itself and its competitors to borrow each other’s secrets, by keeping them in plain sight.
Though it openly shares its server designs, including that of Big Sur, Facebook does not disclose how many servers it uses, how much traffic Prineville handles, or how much data it stores. The company says every bit of data is stored redundantly — in other words, there are numerous backups in the event of outages, power grid failures, and natural disasters.
When asked if there is a cut-and-dried way of determining the physical location of a Facebook connection or scrap of data, Patchett says the site handles the flow more or less algorithmically. So there is no easy answer for where my own, or any other US Facebook user’s, data may be at any given moment. It’s stored in multiple locations and it travels at near-light speed, refracted by the glass and plastic of fiber optic cables and interpreted and understood by millions of servers worldwide. In that sense, Facebook’s biggest data center is as mysterious as it is physically impenetrable.
As Patchett ends our tour, he shows the group a wall of photographs in PRN1 designed to chart the progress of human communication, from cave paintings to the printing press and onward to the typewriter and smartphone. One day, Facebook’s data center will have its own place on the wall, understood as a rudimentary way we shared, stored, and moved bits in the 21st century. But right now, it feels beyond the realm of everyday understanding.
When you type facebook.com into a web browser or click on an item in the News Feed on your iPhone, something might happen deep in Prineville — a momentary blink of blue light. It’s just a single data point, one of probably trillions for all we really know.
Photography by Vjeran Pavic