Off Route 126, past the Ochoco Wayside State Park in Prineville, Oregon, lies a series of monolithic, logo-less gray buildings guarded by security. At first glance, you might think it was a government complex, or maybe an energy company or large-scale manufacturer with a name you do not recognize and would soon forget. Instead, our local cab driver — steering up a steady incline in a beat-up Dodge van — points to the right and says, “That’s all Facebook,” with a matter-of-fact tone.
The sentence seems out of place, given we’ve just been ferried about four miles from a tiny city of about 9,250 people, from the entrance of a Best Western next to a diner called the Apple Peddler. But it’s a reality for the citizens of Prineville, who share their home with Facebook’s first and largest data center, located at the appropriately named 735 Connect Way. The site now totals more than 1 million square feet of both active and in-progress buildings. It’s home to countless Facebook server racks, solar panels, cooling systems, and, of course, data, which is funneled in and out at rates and magnitudes incomprehensible to most non-engineers.
To accurately describe the data center’s purpose — how it functions and what it stores — requires a healthy amount of expertise. Last week, I joined the company’s first open press tour in nearly three years. We were guided by Ken Patchett, Facebook’s director of Western data center operations, who tried to explain the finer details of handling a significant chunk of the social network’s traffic and housing US users’ data.
Facebook's third building in Prineville is under construction, set to open in December.
"We’re using 20 gallons [of water] a minute for both buildings," he tells us of the cooling operation, which he says is low compared to standard municipal water usage. The Prineville buildings draw in the dry Oregon air, through large rectangular slits in the upper halves of the walls, and mixes it with water and server exhaust to achieve an ideal temperature of between 60 degrees and 80 degrees Fahrenheit. Patchett talks of voltage and wattage — 277V per cabinet he says — and transformers and megawatts, educating us about air filter quality and the reduction of static electricity in the air.
If it sounds like a college course on electrical engineering, that's not far off. The data center is a mystical place to the average Facebook user. Yet much of what happens onsite revolves around efficiency: power consumption is kept as low as possible; air is cooled to the ideal temperature; exhaust is dealt with appropriately; servers are fixed in a timely manner; every single cable, server, and rack has a barcode that is scanned. There is no brain or heart or giant blue-hued, humming artificial intelligence calling the shots. Around 165 people manage the entire operation daily.
"This is not a cloud. We put petabytes and petabytes of data through here."
"This is not a cloud," Patchett tells the group as he stands among row after row of server racks in an underground level of PRN2. The building is the second and larger — 450,000 square feet, to be precise — of the site’s two operational data centers. A third is on the way, opening in December. "We put petabytes and petabytes of data through here," Patchett says. A petabyte is one million gigabytes of data. It's a lot of Facebook photos, he adds.
Patchett calls Prineville the heartbeat of Facebook, but it’s more accurate to say it is one of many heartbeats. Facebook relies on numerous data centers around the US. It does not disclose the exact number, but it’s currently building one in Texas and has another operational in Forest City, North Carolina. Around the globe, the company relies on third parties, because building its own data centers on every continent is not cost-effective.
Here in the US, Prineville is the company’s crown jewel, a place that is as functional as it is symbolic for the company’s philosophy on sharing. With Prineville’s first building under construction back in 2011, Facebook announced the Open Compute Project, an open-source community designed to facilitate the exchange of data center hardware and infrastructure design.
"We’re not in the business of having secret things," says Kevin Lee, a technical program manager at Facebook. Lee works on server design and oversees Big Sur, an AI training system developed in collaboration with Nvidia. "Our goal is to understand the world, to push AI." In doing so, the company wants it and its competitors to borrow each other’s secrets, by keeping them in plain sight.
Though it openly shares its server designs, including that of Big Sur, Facebook does not disclose how many servers it uses, how much traffic Prineville handles, or how much data it stores. The company says every bit of data is stored redundantly — in other words, there are numerous backups in the event of outages, power grid failures, and natural disasters.
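Redundancy in the sense described, keeping multiple copies across independent failure domains, can be sketched simply. The site names, the replication factor of three, and the helper functions below are assumptions for illustration, not Facebook's actual storage layout.

```python
# Illustrative sketch of redundant storage across sites, as described in the
# article. Site names and replication factor are assumptions, not Facebook's.
SITES = ["prineville", "forest_city", "fort_worth"]

def replicate(key: str, value: bytes, stores: dict) -> None:
    """Write one copy of the value to every site (replication factor 3)."""
    for site in SITES:
        stores.setdefault(site, {})[key] = value

def read(key: str, stores: dict) -> bytes:
    """Survive an outage: return the value from the first reachable site."""
    for site in SITES:
        if site in stores and key in stores[site]:
            return stores[site][key]
    raise KeyError(key)

stores: dict = {}
replicate("photo:123", b"...", stores)
del stores["prineville"]           # simulate an entire site going dark
assert read("photo:123", stores) == b"..."
```

The point of the sketch is the failure model: losing any one site, even the largest, leaves every key readable from the remaining copies.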
When asked if there is a cut-and-dried way of determining the physical location of a Facebook connection or scrap of data, Patchett says the site handles the flow more or less algorithmically. So there is no easy answer for where my own, or any other US Facebook user’s, data may be at any given moment. It’s stored in multiple locations and it travels at near-light speed, refracted by the glass and plastic of fiber optic cables and interpreted and understood by millions of servers worldwide. In that sense, Facebook’s biggest data center is as mysterious as it is physically impenetrable.
As Patchett ends our tour, he shows the group a wall of photographs in PRN1 designed to chart the progress of human communication, from cave paintings to the printing press and onward to the typewriter and smartphone. One day, Facebook’s data center will have its own place on the wall, understood as a rudimentary way we shared, stored, and moved bits in the 21st century. But right now, it feels beyond the realm of everyday understanding.
When you type facebook.com into a web browser or click on an item in the News Feed on your iPhone, something might happen deep in Prineville — a momentary blink of blue light. It’s just a single data point, one of trillions, for all we really know.
Photography by Vjeran Pavic
- The entrance to Facebook's PRN1 data center in Prineville, OR, appropriately located at 735 Connect Way.
- Servers housed in PRN2, the second of Facebook's data centers. At 450,000 square feet, it's 29 percent larger than PRN1.
- Facebook is secretive regarding how many servers it uses at Prineville, as well as how much traffic it handles and how much data it stores.
- Facebook designed nearly every inch of its data center itself, including the servers, the server racks, and the wiring.
- Facebook's servers generate an enormous amount of heat, most of which is sucked up through exhaust systems in the ceiling, where it's either reused in the cooling process or sent out of the building.
- Kevin Lee, a technical program manager at Facebook, shows reporters Big Sur, a Nvidia-powered AI training system, at PRN2.
- Big Sur, like every other custom server design Facebook develops, is an open source project and publicly available online.
- Ian Buck, Nvidia’s VP of accelerated computing, stands next to a picture-painting AI system being trained by Big Sur at PRN2.
- This AI was fed more than 12,000 French impressionist paintings and learned to create its own art after just 30 minutes of Big Sur training.
- Facebook's data center allows air to be pulled in through open slits in the building, where it is mixed with hot server exhaust and run through water to achieve the ideal temperature.
- Facebook's air filters help in the cooling process, keeping the air free of any contaminants and managing temperature and humidity levels.
- Once air makes its way into the building and through the filters, it is measured for unsafe levels of static electricity before the cooling process.
- Facebook can't make use of all the heat generated by its servers in the cooling process, so any excess air must be pushed out of the building in the relief fan room.
- Facebook chose Oregon for the location of its first data center in part because the state has moderate temperatures and dry air, which is ideal for cooling.
- Facebook runs a mobile device lab at Prineville, where it tests new versions of the company's apps to see how they affect performance on older phones.
- The mobile device lab checks phones as old as the iPhone 4 and Nexus 5 to ensure its apps don't adversely affect battery life. Facebook has nearly 2,000 phones, spread out across 60 server racks with 32 units each.
- Facebook uses solar panels to help generate electricity for office operations, part of the reason why the Prineville data center is considered more efficient than the industry standard.
- Facebook designs all of its own servers and racks, and the company makes those designs open source so members of the Open Compute Project can use them.