A trillion dollars is a terrible thing to waste

Gary Marcus argues that the machine learning mainstream's prolonged focus on scaling large language models may have cost roughly a trillion dollars and produced diminishing returns. He urges a pivot toward new ideas such as neurosymbolic techniques and built-in inductive constraints to address persistent problems.

Breaking coverage centers on a new interview with Ilya Sutskever in which he acknowledges that scaling through more chips and more data is flattening and that large language models generalize worse than people. Sutskever signals openness to neurosymbolic techniques and innateness, and he stops short of forecasting a bright future for pure large language models. Gary Marcus notes that this perspective echoes warnings he and others have made for years, citing his 2018 critique, published before GPT, and his subsequent writings on the limits of deep learning and of the Kaplan scaling laws.

Marcus frames the change as late to the mainstream and costly. He offers a back-of-the-envelope estimate of roughly a trillion dollars spent on the scaling detour, much of it on Nvidia chips and high salaries, and highlights the venture capital dynamics that encourage continued bets on scaling. Marcus calls out the structural incentives of the 2% management fee model for venture capitalists, which can make scaling an attractive, low-risk way for investors to profit even if outcomes disappoint. He and others warn of potential collateral damage if expectations for generative Artificial Intelligence do not materialize, including damaged returns for limited partners, pressure on banks and credit, and a possible macroeconomic correction if AI-related spending has been propping up recent growth.

The piece synthesizes technical and economic critique: technical papers and researchers have documented persistent issues such as hallucination, truthfulness, generalization, and reasoning despite ever-larger models, while commentators in the economic press warn that much of the economy's recent growth is premised on promised productivity gains from generative Artificial Intelligence. Marcus concludes that the field went all in on one path, excluding other disciplines such as cognitive science, and that the community now faces the cost of rediscovering lessons that advocates of neurosymbolic and innate-constraint approaches have long argued. The central warning is practical: a trillion dollars is a terrible sum to have perhaps wasted if the core problems remain unsolved.
