A new report by The New York Times on the internet's energy consumption estimates that data centers worldwide use 30 billion watts of electricity, with as much as 10 billion watts consumed in the United States alone. According to McKinsey & Company, in an analysis commissioned by the Times, only 6 to 12 percent of that energy powers actual computations; the rest keeps servers idling in case of a surge or crash. "This is an industry dirty secret," an anonymous executive told the Times. Other sources quoted in the story call the growing energy needs of servers unsustainable and a "waste": the result of companies building out server capacity far beyond current demand because they want "too many insurance policies" against even a millisecond of downtime.

The cloud now costs 30 billion watts

"Power, Pollution and the Internet" is the first entry in a multi-part investigative series on the environmental impact of cloud computing. Judging by this opening salvo, the series promises to be sharply critical of the technology industry. It calls the industry secretive, slow to change its practices, and unrealistic in its presentation of the internet as an environmental boon — taking particular care to single out Google, Amazon, and Facebook for specific consideration.

The report also presents a distorted and outdated view of the internet and cloud computing. It focuses on frivolous media and entertainment, or "fantasy points and league rankings, snapshots from nearly forgotten vacations kept forever in storage devices." It doesn't really grapple with the cloud as an increasingly essential element of infrastructure, powering industry, government, finance, and commerce, as well as personal communication and data storage.

The irony is that if the cloud weren't essential, we could just give it up or scale it back

In turn, that dilutes the story's impact. It's only when we recognize that the internet isn't a pointless distraction, but is becoming as fundamental to our lives as roads, plumbing, and petroleum, that we understand why data usage and energy costs continue to grow. At that point, the environmental efficiency of data centers becomes an undeniable problem. If the internet weren't so important, we could simply scale it back, like Styrofoam containers and aerosol cans. We can't.

This should be what fills us with urgency. We have to make data centers more efficient; we don't have any other choice.

The cloud and the environment

Concern over the growing energy costs of the internet is not new. In 2007, the Environmental Protection Agency issued a report to the US Congress predicting that data center energy usage could nearly double by 2011, and urging Congress and the industry to push for greater energy efficiency. A year ago, The New York Times asked Stanford University's Jonathan Koomey, an energy policy expert and one of the primary authors of that EPA study, to write a follow-up. His report showed that between 2005 and 2010, data center energy usage had increased by 56 percent worldwide and 36 percent in the US — a much more modest rise than predicted, attributable both to the global recession and to new energy-saving measures.

Now, at the Times' Room for Debate blog, Koomey is back with a response, arguing that the benefits of information technology outweigh the costs. New servers, laptop computers, and tablets are all more energy efficient than their predecessors. Today, says Koomey, data centers use only 1.3 percent of electricity worldwide. On the same blog, Google senior VP Urs Hölzle compares that number to the 25 percent used by transportation, touting Google's improvements to server technology, and citing claims that cloud computing can save billions of dollars in energy costs and millions of metric tons of carbon dioxide.

Is the way we use the internet environmentally reckless?

So the issue isn't that the cloud is consuming more energy and costing more money than it once did. Clearly it is, and it's a significant share of total energy costs. The issue is the overall trend line of energy consumption and whether we're realizing a net savings worth the costs. It's whether the technology industry is meeting or falling short on its claims to improve energy efficiency. Ultimately, it's whether the behavior of consumers and companies — given the real energy costs and environmental impact of the internet — is reasonable or reckless.

Now, it's overwhelmingly likely that much of this is new to many New York Times readers. The better infrastructure works, the more easily it's taken for granted. Much of what we do on the internet is free, frictionless, and yes, even fun.

'It just works' isn't a great recipe for critical self-reflection

But the reason servers aren't using 100 percent of their energy for computation, and server farms aren't constantly running at the limits of their capacity, isn't because (as the Uptime Institute's Bruce Taylor tells the Times) "if you tell somebody they can’t access YouTube or download from Netflix, they’ll tell you it’s a God-given right." It's because that infrastructure powers our businesses, our schools, our police and fire stations, our banks and stock exchanges, and yes, our media. It's because those zippy data transfers help drive our economy, in the same way that the boom in turnpikes, canals, and railroads did 200 years ago. It's because the principle and policy of net neutrality doesn't distinguish between the goofy and the life-saving. It's because the failure of the internet, like the collapse of a bridge, can lead to genuine disasters.

Complaining that a server uses only 10 percent of its electricity for computing is like complaining that only 10 percent of the human brain's neurons are firing at any given time. It forgets that a sharp spike in that electrical activity usually causes a seizure.

This is why the internet needs redundancy and resiliency. It's also why we need to make that redundancy count. Like any insurance policy, we've taken it out because we simply have too much to lose.