Decentralized crypto compute powers artificial intelligence as LLM adoption hits 46%

Decentralized compute networks are leveraging crypto infrastructure to supply scalable, cost-efficient Artificial Intelligence compute as LLM adoption among U.S. workers rises to 45.9%. The shift highlights crypto firms repurposing capacity for next-generation model inference.

Large Language Models, or LLMs, are rapidly entering U.S. workplaces. Recent data cited in the article reports that 45.9% of U.S. workers now use LLMs, up from 30.1% in December 2024 and 43.2% in April 2025. That heavier and broader usage is increasing demand for faster, larger-scale compute to support model inference and more complex workloads. A chart by AlphaTarget illustrates the adoption trend.

Spheron Network is presented as one decentralized solution positioned to meet that demand. The platform offers community-driven infrastructure that distributes inference workloads across a global network, aiming to scale efficiently while lowering costs compared with traditional cloud services. Spheron emphasizes accessibility for developers and businesses, arguing that its model leverages unused resources to make Artificial Intelligence computation more affordable and broadly available.
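The mechanics of that kind of network can be pictured as a simple scheduling problem: route each inference job to the cheapest provider with idle hardware. The sketch below is purely illustrative and assumes a hypothetical list of providers with advertised prices and free GPU capacity; it is not Spheron's actual API, only a minimal model of how distributing work across spare resources can lower costs.

```python
from dataclasses import dataclass

@dataclass
class Provider:
    """A hypothetical compute provider advertising spare GPU capacity."""
    name: str
    price_per_hour: float   # advertised price in USD per GPU hour
    free_gpus: int          # GPUs currently idle

def route_job(providers: list[Provider], gpus_needed: int) -> Provider | None:
    """Pick the cheapest provider that can fit the job, illustrating how a
    decentralized scheduler might use idle hardware to cut inference costs."""
    candidates = [p for p in providers if p.free_gpus >= gpus_needed]
    if not candidates:
        return None
    return min(candidates, key=lambda p: p.price_per_hour)

# Example: three hypothetical providers in different regions
network = [
    Provider("us-east-node", price_per_hour=1.20, free_gpus=4),
    Provider("eu-west-node", price_per_hour=0.85, free_gpus=2),
    Provider("ap-south-node", price_per_hour=0.60, free_gpus=0),
]

chosen = route_job(network, gpus_needed=2)
print(chosen.name if chosen else "no capacity")  # eu-west-node
```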

The article also describes a broader industry pivot from crypto mining to Artificial Intelligence compute. Bitfarms is highlighted as an example of a crypto-focused firm repurposing capacity for high-performance computing and AI tasks, with CEO Ben Gagnon saying the market for AI compute is massive. Decentralized networks can create token-based incentives and tap existing power and hardware to supply compute at scale. Looking ahead, the article concludes that as LLM adoption climbs, decentralized compute could play a critical role in bridging blockchain infrastructure and the future of Artificial Intelligence by offering flexible, cost-efficient alternatives to centralized cloud providers.
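As a rough illustration of how token-based incentives can work in such networks (the reward formula and figures here are hypothetical, not drawn from any specific project), a provider's payout can simply be made proportional to the compute it contributed during a reward period:

```python
def epoch_rewards(contributed_gpu_hours: dict[str, float],
                  tokens_per_epoch: float) -> dict[str, float]:
    """Split a fixed per-epoch reward pool among providers in proportion to
    the GPU hours each contributed (hypothetical incentive scheme)."""
    total = sum(contributed_gpu_hours.values())
    if total == 0:
        return {name: 0.0 for name in contributed_gpu_hours}
    return {name: tokens_per_epoch * hours / total
            for name, hours in contributed_gpu_hours.items()}

# Example: 1,000 tokens split across three providers by contribution
print(epoch_rewards({"node-a": 50.0, "node-b": 30.0, "node-c": 20.0}, 1000.0))
# {'node-a': 500.0, 'node-b': 300.0, 'node-c': 200.0}
```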

Impact Score: 68

Intel Fab 52 outscales TSMC Arizona in advanced wafer production

Intel Fab 52 in Arizona is producing more than 40,000 wafers per month on its 18A node, outpacing TSMC’s current Arizona output on older process technologies. The facility highlights Intel’s focus on advanced manufacturing for its own products while TSMC keeps its leading nodes primarily in Taiwan.

Intel details packaging for 16 compute dies and 24 HBM5 modules

Intel Foundry has outlined an advanced packaging approach that combines Foveros 3D and EMIB-T interconnect to scale silicon beyond conventional reticle limits, targeting configurations with 16 compute dies and 24 HBM5 memory modules in one package. The design is built around upcoming 18A and 14A process nodes and aims to support current and future high bandwidth memory standards.

Four bright spots in climate news in 2025

Despite record emissions and worsening climate disasters in 2025, several developments in China’s energy transition, grid-scale batteries, Artificial Intelligence driven investment, and global warming projections offered genuine reasons for cautious optimism.

2025 cancer breakthroughs reshape treatment and detection

Oncology in 2025 is being transformed by immunotherapy, advanced screening, large-scale clinical trials, and the rapid rise of Artificial Intelligence in medicine, which together are improving survival and quality of life for cancer patients.
