Economists are drawing parallels between today’s Artificial Intelligence boom and the productivity paradox of the Information Age. In 1987, Robert Solow observed that computing had become widespread without showing up clearly in productivity statistics; annual productivity growth had in fact slowed from 2.9% between 1948 and 1973 to 1.1% thereafter. Similar doubts are now emerging as companies promote Artificial Intelligence adoption while broad economic measures remain subdued.
Recent executive survey data underscores that gap. A study published in February by the National Bureau of Economic Research surveyed about 6,000 chief executives, chief financial officers, and other executives at firms that responded to business outlook surveys in the U.S., U.K., Germany, and Australia, and found that the vast majority see little impact from Artificial Intelligence on their operations. About two-thirds of executives reported using Artificial Intelligence, but usage averaged only about 1.5 hours per week, and 25% of respondents reported not using it in the workplace at all. Nearly 90% of firms said Artificial Intelligence has had no impact on employment or productivity over the past three years. Even so, executives forecast that Artificial Intelligence will raise productivity by 1.4% and output by 0.8% over the next three years; firms expected a 0.7% cut to employment, while individual employees surveyed anticipated a 0.5% increase.
That muted picture contrasts with some earlier, more optimistic claims. In 2023, MIT researchers said Artificial Intelligence implementation could boost a worker’s performance by nearly 40% compared with workers who did not use the technology. But economists have questioned when corporate spending, which swelled to more than $250 billion in 2024, will generate clear returns. Torsten Slok of Apollo said Artificial Intelligence is still not visible in employment, productivity, or inflation data, and argued that outside the Magnificent Seven there are no signs of it in profit margins or earnings expectations. Academic findings remain mixed: the Federal Reserve Bank of St. Louis reported a 1.9% increase in excess cumulative productivity growth since the late-2022 introduction of ChatGPT, while a 2024 MIT study projected only a 0.5% increase in productivity over the next decade.
Emerging research suggests uneven adoption may help explain the disconnect. ManpowerGroup’s 2026 Global Talent Barometer, covering nearly 14,000 workers in 19 countries, found that workers’ regular Artificial Intelligence use increased 13% in 2025 while confidence in the technology’s utility fell 18%. A Boston Consulting Group survey of 1,488 full-time U.S.-based workers found that self-reported productivity improved when workers used three or fewer Artificial Intelligence tools but dropped when they used four or more, with respondents reporting brain fog and a rise in small mistakes. IBM’s chief human resources officer said the company would triple its number of young hires, arguing that automating too many entry-level tasks could weaken the future leadership pipeline.
Some economists think the slowdown may still give way to stronger gains, much as the computing boom eventually did: productivity growth picked up, improving by 1.5% between 1995 and 2005 after years of weakness. Erik Brynjolfsson said fourth-quarter GDP was tracking up 3.7% even as a jobs report revised job gains down to just 181,000, and said his analysis showed a 2.7% jump in U.S. productivity last year. Other research found generative Artificial Intelligence improved the efficiency of online tasks for 200,000 U.S. households by between 76% and 176%, although much of the saved time went to leisure rather than work or skills development. Slok said the long-term effect may follow a J-curve, with early disruption followed by stronger gains if companies learn how to implement the technology effectively across the economy.
