The one thing that stands out about predicting the future is how often we are totally wrong. And that’s especially true when we try to predict the big changes in everything from religion to climate. Despite the predictions of 19th century economist Thomas Malthus that exponential growth in population would cause future generations to starve, nothing of the
kind happened. More recent fears that genetically modified foods would lead to catastrophe, or that oil production could only go down, have thus far proven wrong as well.
So why does anyone really think that the future of the economy and society in a new AI world, where computers can be taught to converse and even create, can be predicted with any more success?
In truth — and humility — it must be said that knowing how it will all play out is truly impossible. Especially for this simple economic forecaster. Despite this, the time to be thinking about the implications of the continued rapid development of artificial intelligence for labor markets, businesses, and the entire economy has clearly arrived.
Why all the attention to AI
IBM famously spent more than a decade and billions of dollars to create and refine a computer that could beat the best human chess players, finally defeating world champion Garry Kasparov in 1997. By 2017, AI-powered computers were beating the best humans at every game imaginable, and doing it by training themselves. And the breakneck speed of improvement has continued to this day.
At its heart, AI is the building, refining, and use of predictive models of staggering size. At least they seem staggeringly large to this economist. Our old-fashioned sense of a large model might be one that uses, say, 30 different factors to predict something. AI models use hundreds of billions of parameters, estimated with similarly huge amounts of data, searching for patterns to be used in making predictions. Such models employ machine learning: tentative predictions are compared to actual outcomes, and the model adapts and improves on its own without receiving explicit programming instructions.
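The machine-learning loop described here, in which tentative predictions are compared to actual outcomes and the model adjusts its own parameters, can be sketched at toy scale. This is purely an illustration of the idea, a two-parameter linear model trained by gradient descent, not code from any particular AI system; production models estimate billions of parameters through the same basic cycle of predict, compare, and adjust.

```python
# Toy illustration of machine learning's core loop: predict, compare
# the prediction to the actual outcome, and nudge the parameters.
def fit(xs, ys, lr=0.01, steps=5000):
    w, b = 0.0, 0.0  # the model's two parameters, learned from data
    for _ in range(steps):
        for x, y in zip(xs, ys):
            pred = w * x + b   # tentative prediction
            error = pred - y   # compare to the actual outcome
            w -= lr * error * x  # adjust parameters to shrink the error
            b -= lr * error
    return w, b

# Data generated by y = 2x + 1; the loop recovers roughly those values
# without ever being told the rule explicitly.
w, b = fit([0, 1, 2, 3], [1, 3, 5, 7])
```

No one programs the relationship in; it emerges from repeated comparison of predictions against data, which is what separates machine learning from conventional software.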
It has been the ability of these models to generate content — known as generative AI — that has captured attention in recent years. That content could be text, analysis, spoken words, graphic output, or machine instructions. The range of tasks these models can take on is truly remarkable, and they are already reshaping how knowledge work is performed in a broad swath of industries and activities.
Early adopters have included:
Customer relations: responding to complex inquiries from customers, providing real-time assistance to sales agents during telephone calls, using data on customer history to refine interactions and conduct post-conversation evaluations.
Research and development: exploratory research, idea generation, virtual design and simulation, physical test planning.
Software engineering: inception and planning, system design, coding, testing.
These innovations are already taking hold across a wide range of industries. Banking, real estate, and education are immediate examples. These and other industries aren’t confined to Silicon Valley or the east coast. They are everywhere — including cities and towns in Montana.
AI and economic growth
This has captured the attention of economists. A very old concept in economics is the production possibilities frontier. It captures how resources are used to produce things: when resources are used efficiently — we are on the frontier — the only way to produce more of one thing is to produce less of something else.
Using this jargon, modern, industrialized economies like the United States are “frontier” economies. We cannot raise standards of living for everyone unless we can figure out how to produce more of everything. Economists measure this ability, known as productivity, and have tracked it over the years. As shown in Figure 1, its growth has varied considerably since the end of World War II for the national economy.
Taking averages over each of the seven and a half decades since the 1940s, productivity growth has varied roughly between 1.5% and 3%. Growth was strongest immediately after the war, as production innovations developed to prevail in that conflict flowed back into the peacetime economy. There was also a notable boost in the 2000s, which some have attributed to the growth of globalization and the digital economy.
These are much lower growth rates than countries like China, Korea, or Singapore have experienced. But the so-called “Asian tiger” economies are not frontier economies; they have achieved these faster “catch-up” growth rates by eliminating waste and bottlenecks that held back production.
What are the implications of the widespread adoption of AI for these trends? Many see AI as something that could boost productivity growth for decades to come. And since the gains compound over time, an increase in productivity growth by, say, 2 percentage points could make our economy 50% wealthier in just over 20 years.
These kinds of projections underscore why some have referred to AI as the next industrial revolution.
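The compounding arithmetic behind that claim is easy to verify. The short sketch below is my own illustration, using hypothetical growth rates (1.5% baseline versus 3.5% with a 2-percentage-point AI boost), and counts the years until the faster-growing economy is 50% larger than the baseline.

```python
def years_to_ratio(base_growth, boosted_growth, target_ratio):
    """Years until an economy growing at boosted_growth is
    target_ratio times as large as one growing at base_growth."""
    ratio, years = 1.0, 0
    while ratio < target_ratio:
        # each year the boosted economy gains this factor on the baseline
        ratio *= (1 + boosted_growth) / (1 + base_growth)
        years += 1
    return years

# Hypothetical rates: 1.5% baseline vs. 3.5% with a 2-point boost.
print(years_to_ratio(0.015, 0.035, 1.5))  # prints 21, i.e. just over 20 years
```

The particular rates chosen barely matter; what drives the result is the 2-percentage-point gap, which compounds into roughly a 2% annual wedge between the two paths.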

Short- and Longer-Term Challenges
Optimistic predictions could flounder, however, unless bottlenecks to producing what is needed to enable growth in AI capacity are effectively addressed. Two significant constraints are computer-chip capacity and electric power generation and delivery.
We got a painful lesson in computer chip-making constraints in the aftermath of the pandemic, when the quick snap-back in motor vehicle demand produced a surge in demand for chips that the global industry could not immediately satisfy. The images of acres of land covered with row after row of newly assembled vehicles awaiting the chips needed to become operational are hard to forget.
That story involved much less sophisticated chips than those needed for AI-powered data processing. Data center growth to support AI will involve heavy demand for Graphics Processing Unit (GPU) and high-capacity memory chips that are already in short supply. New demand for AI-type capabilities in traditional devices like laptops, tablets, and phones is yet another source of demand.
Bain & Company, a global consulting company, notes that the supply chains for semiconductor production are fragile and have difficulty responding to fluctuations in demand of the magnitude being projected. At some stages of production, capacity increases of 200% or more may be necessary.
A second immediate challenge comes from the power needs of an AI-fueled data processing expansion. Data centers already figure prominently in end uses of electricity, serving everything from cryptocurrency to e-commerce activities. Power demand for AI-computing tasks will be especially significant, given the size of the datasets involved and the complexity of the calculations. A power grid that already struggles with adequacy, given the pressures to reduce carbon emissions from its generating fleet, faces a new challenge with the rollout of AI.
McKinsey & Company, a global management consulting firm, predicts that the power needs of data centers will account for more than 11% of all electricity consumed by the year 2030, compared to a 3% to 4% share today. Not all of this new demand is AI related, but the rapid scaling of AI to serve wider applications is the primary driver of the change. Given the long lead time needed to build up supply capacity, not to mention the political pressures that have forced delays and cancellations of power projects in recent history, this is a daunting challenge.
Opportunities for Montana?
It is hard to know what this “next industrial revolution” powered by AI means for Montana’s economy. But we can at least start to think about its impacts unfolding in two broad ways.
First, we can consider the implications of AI for the existing industries that comprise our economic drivers today. In a competitive environment, adoption of AI is not a choice for most businesses. And it could be a boon, especially if the technology brings specialized knowledge within the reach of businesses here that are currently unable to access it. You might need to be in San Francisco, for example, to get access to a patent attorney who speaks Mandarin today. But will that be true in the future?
A second dimension of change for Montana would be to consider what entirely new business activities could take root here that were not feasible or economic before. Recent migration changes have made it clear that mountain west states like Montana have powerful appeal as places to live. It is unknowable, but a technological change that is fundamentally aimed at the production of knowledge could potentially change the equation for business location decisions in ways that would diversify our economic base.
Patrick M. Barkey is research director at the Bureau of Business and Economic Research at the University of Montana.