The “Energy Transition” Won’t Happen
The laptop class has rediscovered a basic truth: foundational innovation, once adoption proceeds at scale, is followed by an epic increase in energy consumption.
It’s an iron law of our universe.
To illustrate that law, consider three recent examples, all vectors leading to the “shocking” discovery of radical increases in expected electricity demand now occupying headlines.
First, there’s the electric car, which, if there were one in every garage, as enthusiasts hope, would roughly double residential neighborhood electricity demand.
Next, there’s the push to repatriate manufacturing, especially of semiconductors. This arguably qualifies as a “foundational innovation,” given policymakers’ sudden concern over the decades-long exit of such industries from the U.S.
Restoring American manufacturing to, say, the global market share of just two decades ago would see industrial electricity demand soar by 50 percent.
And now the scions of software are discovering that both virtual reality and artificial intelligence, which emerge from the ineluctable mathematics of machine-learning algorithms, are anchored in the hard reality that everything uses energy.
This is especially true for the blazing-fast and power-hungry chips that make AI possible. Nvidia, the leader of the AI-chip revolution and a Wall Street darling, has over the past three years alone shipped some 5 million high-power AI chips.
To put this in perspective, every such AI chip uses roughly as much electricity each year as do three electric vehicles. And while the market appetite for electric vehicles is sagging and ultimately limited, the appetite for AI chips is explosive and essentially unlimited.
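For the numerically inclined, here is a minimal back-of-envelope sketch of that chip-versus-EV comparison. Every input is an assumption chosen for illustration (an assumed 700-watt accelerator running year-round, an assumed datacenter overhead factor, assumed EV mileage and efficiency), not a measured figure:

```python
# Back-of-envelope: annual electricity for one AI accelerator vs. one EV.
# Every input is an assumption for illustration, not a measured figure.

HOURS_PER_YEAR = 8760

chip_watts = 700     # assumed board power of a high-end AI accelerator
pue = 1.5            # assumed datacenter overhead (cooling, power delivery)
chip_kwh = chip_watts * pue * HOURS_PER_YEAR / 1000   # ~9,200 kWh/year

ev_miles_per_year = 12_000   # assumed annual mileage
ev_kwh_per_mile = 0.33       # assumed wall-to-wheels consumption
ev_kwh = ev_miles_per_year * ev_kwh_per_mile          # ~4,000 kWh/year

print(f"AI chip: {chip_kwh:,.0f} kWh/yr; EV: {ev_kwh:,.0f} kWh/yr; "
      f"ratio: ~{chip_kwh / ev_kwh:.1f} EVs per chip")
```

Under these assumptions the ratio lands between two and three EVs per chip; different assumptions shift the number, but not the order of magnitude.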
Consider a recent headline in the Wall Street Journal: “Big Tech’s Latest Obsession Is Finding Enough Energy”—because the “AI boom is fueling an insatiable appetite for electricity.” And, as Reuters reports, “U.S. electric utilities predict a tidal wave of new demand . . . . Nine of the top 10 U.S. electric utilities said data centers were a main source of customer growth.”
Today’s forecasts see near-term growth in demand for electric power three times as great as in recent years. Rediscovery of the iron law of growth inspired an urgent Senate hearing on May 21, entitled “Opportunities, Risks, and Challenges Associated with Growth in Demand for Electric Power in the United States.” (Full disclosure: I testified at that hearing.)
Data centers, the information “power plants” at the center of the cloud revolution, are flagged as the primary culprit for this exploding power demand. These warehouse-scale buildings are chock-full of all manner of computer chips, including conventional processors, memory chips, and communications chips. And now data centers are pouring AI chips into the mix as fast as manufacturing plants can build them.
As one researcher notes, adding AI to Google “search” boosts the energy use per search tenfold. And that’s only the first, and perhaps the least significant, of the many possible applications for AI.
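A minimal sketch of what that tenfold figure could imply in aggregate, assuming a commonly cited ballpark of about 0.3 watt-hours per conventional search and an assumed nine billion searches a day (both assumptions, not measurements):

```python
# Rough aggregate implied by the "tenfold per search" claim.
# Every input below is an assumption for illustration, not a measurement.

searches_per_day = 9e9        # assumed global search volume per day
wh_per_plain_search = 0.3     # assumed energy per conventional search (Wh)
wh_per_ai_search = 10 * wh_per_plain_search   # the article's tenfold figure

extra_wh_per_year = (wh_per_ai_search - wh_per_plain_search) * searches_per_day * 365
print(f"Extra demand if every search used AI: ~{extra_wh_per_year / 1e12:.0f} TWh/year")
```

Under these assumptions, search alone would add roughly 9 terawatt-hours a year, on the order of a small country’s annual consumption.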
As one senior operative at Friends of the Earth recently put it: “We can see AI fracturing the information ecosystem just as we need it to pull it back together.” But the deeper worry is not AI and child safety, or deep fakes, or the looming threat of new regulations. It is what AI’s appetite means for aspirations to an “energy transition” in how the world is fueled.
It is inconvenient, to put it mildly, to see demand for electricity—especially reliable, 24–7 supply—take off at the same time as regulators are forcing utilities to shut down conventional power plants and spend money on costlier and less reliable power from wind and solar hardware. The epiphany that transition aspirations and the power realities of AI are in conflict was epitomized in a recent New Yorker essay titled “The Obscene Energy Demands of A.I.” The article’s subtitle asks:
“How can the world reach net zero if it keeps inventing new ways to consume energy?” The question answers itself.
The challenge is not only the need for far more electricity than was forecast a mere year or so ago but also the need for it to be both inexpensive and available precisely when needed—and soon. New factories and new datacenters are coming online rapidly, with many more arriving in a few years, not decades.
There aren’t many ways to meet the velocity and scale of the coming electricity demand without a boom in building more natural-gas-fired power plants.
This seemingly sudden change in the electricity landscape was predictable—and predicted. Almost exactly 25 years ago, my long-time colleague Peter Huber and I published articles in both Forbes and the Wall Street Journal pointing to the realities at the intersection of energy and information. (A decade ago, I also published a study on the matter, which, it turns out, accurately forecast the electricity demands of the data economy, and I more recently expanded on that theme in my book The Cloud Revolution.)
At the time, we were nearly alone in making such observations in the public-policy space, but we were far from alone in the technical community, which has long recognized the power realities of information. Indeed, in the engineering community, the convention for talking about the size of datacenters is in terms of megawatts, not square feet.
There’s a full-on race in the tech industry, and in tech-centric investment communities, to spend billions of dollars on new AI-infused infrastructures. The furious pace of expanding manufacturing to produce AI-capable silicon chips and simultaneously building massive, AI-infused datacenters is shattering the illusion that a digital economy enables a decoupling of economic growth from rising energy use.
As recently as two years ago, an analysis from the OECD (an organization in the vanguard of the “energy transition” vision) concluded: “Digital transformation is increasingly recognised as a means to help unlock the benefits of more inclusive and sustainable growth and enhanced social well-being.
In the environmental context, digitalisation can contribute to decoupling economic activity from natural resource use and their environmental impacts.” It turns out that the physics of power and information neutered that aspiration.
Now the key question for policymakers and investors is whether the current state of affairs is a bubble or signals a more fundamental shift. Just how much more power will information consume? It is now conventional wisdom that the digital economy is vital for economic growth and that information supremacy matters both for economies and for militaries.
But the core feature of an information-centric economy is in the manufacturing and operation of digital hardware—and unavoidably, the energy implications of both.
To see what the future holds, we must take a deep dive into the arcana of today’s “cloud,” the loosely defined term denoting the constellation of data centers, hardware, and communications systems.
Each datacenter—and tens of thousands of them exist—has an energy appetite often greater than that of a skyscraper the size of the Empire State Building. And each of the nearly 1,000 so-called hyperscale datacenters consumes more energy than a steel mill (and this is before counting the impact of piling on AI chips). The incredible level of power use derives directly from the fact that just ten square feet of a datacenter today holds more computing horsepower than all the world’s computers circa 1980.
And each square foot of a datacenter creates electric power demand roughly 100 times greater than a square foot of a skyscraper does. Even before the AI revolution, the world was adding tens of millions of square feet of datacenter space each year.
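A minimal sketch of what those two claims imply together, using assumed power densities and an assumed annual buildout (real figures vary widely by building):

```python
# Illustrating the density claim and the buildout that follows from it.
# Both densities and the buildout figure are assumptions for illustration.

office_w_per_sqft = 5      # assumed continuous electric load, office tower
dc_w_per_sqft = 500        # assumed building-wide average, datacenter

print(f"Power density ratio: ~{dc_w_per_sqft / office_w_per_sqft:.0f}x per square foot")

new_sqft_per_year = 30e6   # assumed annual global datacenter floor-space additions
new_gw = new_sqft_per_year * dc_w_per_sqft / 1e9
print(f"Implied new demand: ~{new_gw:.0f} GW per year, "
      f"on the order of {new_gw:.0f} large power plants")
```

Under these assumptions, the floor space added in a single year demands roughly as much power as fifteen gigawatt-class generating stations.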
All that silicon horsepower is connected to markets on an information highway, a network whose scale vastly exceeds that of any of its asphalt and concrete analogues. The universe of communications hardware transports bytes not only along “highways” composed of about 3 billion miles of glass cables but also along the equivalent of another 100 billion miles (that’s 1,000 times the distance to the sun) of invisible connections forged by 4 million cell towers.
The physics of transporting information is captured in a surprising fact: the energy used to enable an hour of video is greater than a single passenger’s share of the fuel consumed on a ten-mile bus ride.
While a net energy-use reduction does occur when someone Zooms rather than commutes by car (the “dematerialization” trope), at the same time, there’s a net increase in energy use if Zoom is used to attend meetings that would never have occurred otherwise.
When it comes to AI, most of what the future holds are activities that would never have occurred otherwise.
Thus, the nature of the cloud’s energy appetite is far different from that of many other infrastructures, especially compared with transportation.
For transport, consumers see where 90 percent of energy gets spent when they fill up a gas tank or recharge a battery. When it comes to information, though, over 90 percent of energy use takes place remotely, hidden away until utilities “discover” the aggregate impact.
Today’s global cloud, which has yet to absorb fully the power demands of AI, has grown from nonexistence several decades ago to using twice as much electricity as Japan. And that estimate is based on the state of hardware and traffic of several years ago.
Some analysts claim that, as digital traffic has soared in recent years, efficiency gains were muting or even flattening growth in datacenter energy use.
But such claims face countervailing factual trends. Since 2016, there’s been a dramatic acceleration in datacenter spending on hardware and buildings, along with a huge jump in the power density of that hardware—and again, all of this before the AI boom.
To guess what the future holds for the energy appetite of the cloud, one must know two things: first, the rate at which efficiency improves for digital hardware in general, especially for AI chips; second, the rate of growth in demand for data itself.
The past century of modern computing and communications shows that demand for data has grown far faster than engineers can improve efficiency. There’s no evidence to suggest this trend will change.
In fact, today’s information-system energy use is the result of astounding gains in computing energy-efficiency. At the energy-efficiency of computing circa 1984, a single iPhone would use as much power as a skyscraper. If that were the case, there would be no smartphones today. Instead, we have billions of them.
The same patterns hold across the entire silicon landscape, including for AI. Chip efficiencies for AI are improving at a blistering pace. Nvidia’s latest chip is 30-fold faster for the same power appetite.
That won’t save energy—it will accelerate the market’s appetite for such chips at least 100-fold. Such is the nature of information systems. And the continued and dramatic improvement in AI chip efficiencies is built into the assumptions of all the industry-insider forecasts of ballooning overall energy use for AI.
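The rebound arithmetic here is simple enough to show directly, using the article’s own two figures applied to an arbitrary baseline:

```python
# The rebound arithmetic behind "efficiency won't save energy":
# the article's 30x efficiency gain and 100x demand growth,
# applied to an arbitrary baseline of 1 unit of energy.

efficiency_gain = 30    # new chip does 30x the work per watt
demand_growth = 100     # market appetite grows at least 100-fold

net_energy = 1.0 * demand_growth / efficiency_gain
print(f"Net energy use: ~{net_energy:.1f}x the baseline")  # ~3.3x
```

Even with a thirtyfold efficiency gain, a hundredfold growth in appetite still more than triples total energy use.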
But this raises the fundamental question: just how much demand is there for data, the “fuel” that makes AI possible? We are on the cusp of an unprecedented expansion in both the variety and scale of data yet to be created, stored, and subsequently refined into useful products and services. As a practical matter, information is an infinite resource.
If it feels as though we’ve reached a kind of apotheosis in all things digital, the truth is otherwise: we are still in the early days. As an economic resource, data are unlike natural analogues—because humanity literally creates data.
And the technological means for generating that resource are expanding in scale and precision. It’s one of those rare times when rhetorical hyperbole understates the reality.
The great explosion of data production will come from the expanding capacity to observe and measure the operations and activities of both our built environment and our natural environment, amplified by the increasing automation of all kinds of hardware and systems.
Automation requires sensors, software, and control systems that necessarily generate massive data streams. Long before we see the autonomous car, for example, the “connected” car, with all its attendant features and safety systems, is already generating massive data flows.
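A minimal sketch of how vehicle telemetry alone could approach zettabyte scale, using assumed figures for per-car data generation, daily driving time, and fleet size (all illustrative assumptions):

```python
# How connected-car telemetry alone could approach zettabyte scale.
# Every figure below is an illustrative assumption.

gb_per_car_hour = 25     # assumed data generated per connected car per hour
hours_per_day = 1.0      # assumed average daily driving time
fleet_size = 100e6       # assumed number of connected cars

bytes_per_year = gb_per_car_hour * 1e9 * hours_per_day * 365 * fleet_size
print(f"~{bytes_per_year / 1e21:.1f} zettabytes per year from cars alone")
```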
Similarly, we’re seeing radical advances in our capacity to sense and measure all the features of our natural environment, including our own bodies.
Scientists now collect information at astronomical scales, not only in the study of astronomy itself but also in the biological world, with new instruments that generate more data per experiment than was trafficked on the entire Internet a few decades ago.
All trends face eventual saturation. But humanity is a very long way away from peak information supply. Information, in effect, is the only limitless resource.
One way to guess the future magnitude of data traffic—and, derivatively, the energy implications—lies in the names of the numbers we’ve had to create to describe quantities of data. We count food and mineral production in millions of tons; people and their devices in billions of units; airway and highway usage in trillions of air- or road-miles;
electricity and natural gas in trillions of kilowatt-hours or cubic feet; and our economies in trillions of dollars. But, at a rate of a trillion per year of anything, it takes a billion years to total one “zetta”—i.e., the name of the number that describes the scale of today’s digital traffic.
The numerical prefixes created to describe huge quantities track the progress of society’s technologies and needs. The “kilo” prefix dates back to 1795. The “mega” prefix was coined in 1873, to name 1,000 kilos.
The “giga” prefix for 1 billion (1,000 million) and “tera” (a trillion, or 1,000 billion) were both adopted in 1960. In 1975, we saw the official creation of the prefixes “peta” (1,000 tera) and “exa” (1,000 peta), and then “zetta” (1,000 exa) in 1991. Today’s cloud traffic is estimated to be roughly 50 zettabytes a year.
It’s impossible to visualize such a number without context. A zetta-stack of dollar bills would reach from the earth to the sun (93 million miles away) roughly 700,000 times.
All the molecules that comprise the Earth’s atmosphere weigh about five zettagrams. Even if each byte entails an infinitesimal amount of energy, the sheer volume of zettabyte-scale operations leads to consequential energy use.
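One more way to grasp the scale is to convert the article’s 50-zettabyte annual estimate into a per-second rate, a straightforward division:

```python
# Converting the article's 50 zettabytes per year into a per-second rate.
ZETTA = 1e21
traffic_bytes_per_year = 50 * ZETTA
seconds_per_year = 365 * 24 * 3600

petabytes_per_second = traffic_bytes_per_year / seconds_per_year / 1e15
print(f"~{petabytes_per_second:.1f} petabytes of traffic every second")
```

That works out to roughly 1.6 petabytes every second, around the clock, all year.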
Until just over a year ago, there was only one remaining official prefix name for a number bigger than a zetta: the 1,000 times bigger “yotta.” Given the AI-accelerated pace of data expansion, we’ll soon be in the yottabyte era.
So now the bureaucrats at the Paris-based International Bureau of Weights and Measures have officially given names to even bigger numbers, because before long, data traffic will blow past the yottabyte scale. One thousand yottabytes? That’s a ronnabyte. Your children will be using such numbers.
Such astonishing volumes of data being processed and moved will overwhelm the gains in energy efficiency that engineers will inevitably achieve.
Already today, more capital is spent globally on expanding the energy-consuming cloud each year than all the world’s electric utilities combined spend to produce more electricity.
Credit Marc Andreessen’s “Techno-Optimist Manifesto” for observing that “energy is the foundational engine of our civilization. The more energy we have, the more people we can have, and the better everyone’s lives can be.”
Our cloud-centric and AI-infused twenty-first-century infrastructure illustrates this fundamental point. The world will need all forms of energy production imaginable. An “energy transition” would only restrict energy supplies—and that’s not going to happen. The good news is that the U.S. does have the technical and resource capacity to supply the energy needed. The only question is whether we have the political will to allow the proverbial “all of the above” energy solutions to happen.
See more here: City Journal
Wisenox
Two aspects are readily apparent:
1) They want you on renewables, but they will be using hard electric.
2) If you control carbon, you control energy and manufacturing; hence the climate-change lies.
Also, rhetorically, why isn’t the petrodollar expiring a story anywhere?
Howdy
Already posted earlier. I see all the URLs of the duplicates end in “-2”.
“This is especially true for the blazing-fast and power-hungry chips that make AI possible.”
You don’t need fast chips for AI; it runs on vintage central processing units. But graphics processors leave CPUs in the dust for certain operations and have massive computational ability. They can be used to process data extremely quickly in medical scenarios and the like, where they are already employed.
The primary use, driving graphics displays, makes them a hog of power resources in a computer system, because the data must be shifted quickly. Add in the capability to use them in multiples, and the consumption goes up enormously.
It wasn’t AI that birthed them; AI deserves no credit, nor is it their only use.
Howdy
http://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-pascal-architectures-nvlink-to-enable-8-way-multi-gpu-capability/