Many executives are missing out on the AI productivity surge, echoing the paradox of the IT era for economists.

Summary
Robert Solow’s productivity paradox noted that technological advances were not showing up as productivity gains.
Most executives report minimal impact of AI on productivity and employment in their firms.
Future productivity gains from AI may depend on effective implementation across different sectors.

In the late 1980s, economist Robert Solow, a Nobel laureate, remarked on the stagnation of productivity growth despite the advancements of the Information Age, such as transistors and microprocessors. Contrary to expectations that these innovations would enhance workplace efficiency, productivity growth actually slowed, from 2.9% between 1948 and 1973 to just 1.1% afterward. Instead of driving a productivity boom, the new computing technologies often overwhelmed users with excessive information, producing detailed reports and volumes of paper rather than the anticipated efficiency gains. This unexpected phenomenon became known as Solow’s productivity paradox. As he famously wrote in a 1987 New York Times Book Review article, "You can see the computer age everywhere but in the productivity statistics."

Fast forward to today, and similar concerns are resurfacing around artificial intelligence (AI) in corporate America. Recent research, including an analysis by the Financial Times, reveals that while many C-suite executives are optimistic about AI's potential, that optimism is not translating into significant productivity improvements. Although 374 firms in the S&P 500 mentioned AI positively on their earnings calls, an extensive survey of 6,000 executives found that a large share see minimal impact from AI in their workplaces. Roughly two-thirds use AI, but their average use amounts to just 1.5 hours per week, and one-quarter do not use AI at all in their operations. The majority of companies reported no changes in productivity or employment linked to AI over the past three years.

Yet executives remain hopeful, projecting a productivity rise of 1.4% and an output increase of 0.8% over the next three years. Interestingly, while businesses expect a slight reduction in jobs, the employees surveyed anticipated improved employment prospects.

Research from MIT in 2023 suggested that AI could boost individual worker performance by nearly 40%, but mixed results have left economists questioning when, or whether, AI will deliver a return on investment, especially as corporate expenditures on AI surpassed $250 billion in 2024. Apollo's chief economist, Torsten Slok, echoed Solow's sentiment when he said that today, as decades ago, AI is not evident in key economic metrics such as employment or productivity data. He noted that aside from a few high-profile tech companies, there are no discernible improvements in profit margins or earnings forecasts linked to AI deployment.

Different studies paint conflicting pictures of AI's effectiveness. Some report slight productivity increases, such as a 1.9% rise noted by the Federal Reserve Bank of St. Louis, while a more recent MIT study reported only a modest 0.5% gain over ten years. Nobel laureate Daron Acemoglu acknowledged that while any increase is positive, it pales in comparison to industry expectations.

Further complicating the discourse is a study by ManpowerGroup, which revealed that while regular AI usage among workers surged by 13% in 2025, confidence in the technology dropped by 18%, highlighting ongoing skepticism about its real benefits. Moreover, a Boston Consulting Group study suggests that using too many AI tools could lead to decreased productivity, as employees reported feeling overwhelmed and making more mistakes when juggling numerous applications.

IBM’s chief human resources officer remarked on the company's plan to increase its workforce, pointing out that while AI can automate tasks, it could also hinder the progression of future leaders by displacing entry-level positions.

Could this change? History shows that the IT advances of the 1970s and 1980s eventually fed significant productivity booms in the 1990s and early 2000s. Erik Brynjolfsson, director of Stanford University's Digital Economy Lab, suggested that we may witness a similar turnaround with AI, observing a notable GDP increase and a recent productivity uptick attributed to firms beginning to leverage AI effectively. This echoes past periods in which job growth and GDP growth became less correlated due to automation.

Interestingly, a Stanford study noted hidden productivity gains, revealing that AI has significantly improved efficiency in everyday online tasks, although the time saved often shifts towards leisure rather than skill advancement.

Slok likened the future trajectory of AI productivity to a "J-curve," in which initial underperformance is followed by significant improvements, depending on how effectively organizations integrate AI into their operations. Unlike previous IT innovations that had lengthy development cycles, AI tools are quickly accessible thanks to competitive forces, making their adoption and execution across industries crucial to realizing the technology's economic potential. Ultimately, the value will stem not solely from the technology itself but from innovative applications across sectors.
