Bridging the operational artificial intelligence gap in enterprises

Enterprises are moving artificial intelligence from pilots to production, but many struggle without strong integration, governance, and operational foundations. New survey data from senior IT leaders links enterprise-wide integration platforms to broader, more autonomous artificial intelligence deployments.

Enterprises are accelerating the move from experimental artificial intelligence projects to production deployments, with many already testing agentic artificial intelligence to achieve higher levels of automation. Despite this momentum, many organizations still face an uncertain path to full operational success, as enterprise-wide adoption remains difficult. Without integrated data and systems, reliable automated workflows, and clear governance models, artificial intelligence initiatives can stall in proof-of-concept stages and fail to scale.

The growing autonomy of agentic artificial intelligence systems heightens the need for a holistic approach to connecting data, applications, and systems across the enterprise. Gartner predicts that over 40% of agentic AI projects will be cancelled by 2027 due to cost, inaccuracy, and governance challenges, underscoring that the core problem often lies in missing operational foundations rather than in the models themselves. A December 2025 survey by MIT Technology Review Insights of 500 senior IT leaders at mid-size to large US companies pursuing artificial intelligence, supplemented by expert interviews, finds that organizations with strong integration platforms are better positioned for advanced, enterprise-wide implementations and can avoid duplication, silos, and loss of oversight as workflows become more autonomous.

The research shows that some organizations are now achieving tangible results: three in four (76%) surveyed companies have at least one department with an artificial intelligence workflow fully in production. Success is most common when artificial intelligence is applied to well-defined, automated processes: 43% of organizations report strong outcomes in such scenarios, while a quarter report success with new processes and one-third (32%) are applying artificial intelligence across a variety of processes. Operational ownership, however, remains fragmented: only one in three (34%) organizations has a dedicated team to maintain artificial intelligence workflows, while 21% rely on central IT, 25% on departmental operations, and 19% distribute responsibility more broadly. Companies with enterprise-wide integration platforms are five times more likely to use diverse data sources in their artificial intelligence workflows: six in 10 (59%) use five or more data sources, compared with 11% of organizations that integrate only specific workflows and none of those without any integration platform. These companies also report broader multi-departmental deployments, greater current autonomy, and higher confidence in assigning more autonomy in the future.

Impact Score: 55

Adobe plans outcome-based pricing for artificial intelligence agents

Adobe is positioning its artificial intelligence agents around performance-based pricing, charging only when the software completes useful work. The approach points to a more results-oriented model for selling generative artificial intelligence tools to business customers.

Tech firms commit billions to artificial intelligence infrastructure

Amazon, OpenAI, Nvidia, Meta, Google, and others are signing increasingly large cloud, chip, and data center agreements as demand for artificial intelligence infrastructure accelerates. The latest wave of deals spans investments, compute purchases, chip supply agreements, and data center buildouts.

JEDEC outlines LPDDR6 expansion for data centers

JEDEC has previewed planned updates to LPDDR6 aimed at pushing the memory standard beyond mobile devices and into selected data center and accelerated computing use cases. The roadmap includes higher-capacity packaging options, flexible metadata support, 512 GB densities, and a new SOCAMM2 module standard.

TSMC debuts A13 process technology

TSMC has introduced its A13 process at its 2026 North America Technology Symposium as a tighter version of A14 aimed at next-generation artificial intelligence, high-performance computing, and mobile designs. The company positions the node as a more compact and efficient option with backward-compatible design rules for faster migration.
