Artificial Intelligence could restore competition in the US economy

Artificial Intelligence is emerging as a threat to entrenched business models, but it may also revive competition in an economy that has grown increasingly concentrated. Lower barriers to entry and heavier capital investment could boost productivity, wages, and long-term growth if policymakers resist consolidation.

Artificial Intelligence is unsettling a growing list of industries, from software to wealth management, insurance brokerage, and property services. While markets have focused on the risks to incumbent companies, the broader economic effect could be more constructive. Artificial Intelligence has the potential to restore competitive pressure and revive innovation across sectors that have become increasingly concentrated.

Competition in the US has been declining for the past 40 years. According to the US Federal Reserve, the share of employment accounted for by newly formed firms fell 43% between 1980 and 2016. Since the late 1990s, more than three-quarters of US industries have become more concentrated. Only health care and utilities have avoided increases in top-four market share (the combined revenue share of an industry's four largest firms), with many industries experiencing double-digit gains. Low real interest rates in the early 2000s and 2010s, permissive antitrust standards since the 1980s, and advances in computing all helped strengthen incumbent firms and support consolidation.

That concentration has benefited equities by making cash flows more predictable and supporting higher margins. But stronger profitability has not necessarily translated into stronger productivity. Firms with the biggest increases in market concentration have posted higher margins without matching gains in operational efficiency, suggesting that pricing power, rather than better production, drove much of the improvement. Productivity is often negatively correlated with profitability and stock performance, yet it remains the foundation of long-term real wage growth and rising living standards.

Unfortunately, productivity growth since 2010 has been roughly one-third lower than in the prior three decades. Artificial Intelligence could change that by lowering barriers to entry, reducing the advantages of scale in knowledge work, and helping smaller firms compete more effectively. At the same time, hyperscalers are expanding beyond search, social media, and cloud computing, investing heavily in data centers and model development.

Capital investment is the foundation of productivity growth. One estimate suggests hyperscalers could spend roughly 2.1% of GDP on capital expenditures in 2026. That spending could drive economic expansion, but it may also produce the familiar side effects of major infrastructure booms, including excess capacity, volatility, and efforts by dominant firms to secure protection through government support or mergers.
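
To put that estimate in scale, a back-of-the-envelope calculation helps. The sketch below assumes nominal US GDP of roughly $30 trillion in 2026; that GDP figure is an assumption added for illustration, not part of the original estimate.

# Rough dollar scale of the projected hyperscaler capex.
# Assumption: nominal US GDP of about $30 trillion in 2026
# (an illustrative figure, not part of the original estimate).
gdp_2026 = 30e12            # assumed nominal US GDP, in dollars
capex_share = 0.021         # hyperscaler capex as a share of GDP (from the text)
capex = gdp_2026 * capex_share
print(f"Implied hyperscaler capex: ${capex / 1e9:,.0f} billion")
# Output: Implied hyperscaler capex: $630 billion

On that assumption, a 2.1% share of GDP works out to around $630 billion a year, a scale at which the overcapacity and volatility risks of past infrastructure booms become plausible.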

The long-term outcome will depend on whether productivity gains come from producing more with the same workforce or producing the same output with fewer workers. Preserving competition through support for new entrants and resistance to bailouts and excessive consolidation may pressure margins and valuations. But it could also restore dynamism, lift productivity, raise real wages, and strengthen long-term economic growth.

Impact Score: 58

Arm moves into chip production with new data center CPU

Arm is moving beyond licensing and into chip production with a new data center processor aimed at Artificial Intelligence workloads. Meta Platforms will be the lead partner as Arm targets a much larger revenue opportunity in data center infrastructure.

LiteLLM breach exposes Artificial Intelligence supply chain risks

A malware infection in LiteLLM, a widely used open-source Artificial Intelligence gateway, has raised concerns about credential theft and the security of enterprise Artificial Intelligence dependencies. The incident also raises questions about third-party compliance reviews, since Delve had certified the project.

TurboQuant targets large language model compression

Google’s TurboQuant is presented as a compression approach for large language models and vector search engines that aims to cut memory use while preserving accuracy. The system combines new quantization methods to make models faster, cheaper, and easier to deploy at larger scale.
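
The announcement does not spell out TurboQuant's methods, but the general idea behind quantization is easy to illustrate. The Python sketch below shows plain symmetric int8 quantization of a weight vector; it is a generic example of the technique, not TurboQuant's algorithm, and the function names are hypothetical.

import numpy as np

def quantize_int8(weights: np.ndarray):
    # Symmetric int8 quantization: map floats in [-max, max] to [-127, 127].
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    # Recover approximate float values; precision is traded for memory.
    return q.astype(np.float32) * scale

weights = np.random.randn(1024).astype(np.float32)
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# int8 storage uses 4x less memory than float32, at some accuracy cost.
print("max abs error:", np.abs(weights - approx).max())

Even this simple scheme cuts memory use fourfold; the pitch for systems like TurboQuant is to push compression further while keeping the accuracy loss small.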
