IBM expands enterprise Artificial Intelligence for hybrid cloud and mainframes

IBM introduced new enterprise Artificial Intelligence products centered on hybrid infrastructure, mainframes and agent orchestration. The updates reinforce its multi-model strategy while targeting regulated and on-premises environments where business data remains entrenched.

IBM unveiled a new set of enterprise Artificial Intelligence platforms and updates at its Think 2026 conference in Boston, focusing on the company’s traditional strengths in large organizations running mainframe-based and hybrid systems. The announcements included the IBM Bob development package for mainframes, enhancements to the Watsonx Orchestrate platform for Artificial Intelligence agents, and the general availability of IBM Sovereign Core. The releases underscore the company’s continued emphasis on enterprise customers rather than competing head-on with hyperscalers and frontier model developers on model scale alone.

Arvind Krishna said, “But you have to act where the data is, and over 70% of all the data is still sitting inside the enterprise in systems that are core and germane to them. And so, we have to couple what we do there with hybrid cloud.” Rob Thomas positioned Watsonx Orchestrate as a leading agent platform and said IBM will keep adding tools for customers to build their own agents. He also described IBM Bob, a software development lifecycle suite using Artificial Intelligence, as “the first that’s designed for multi-model and cloud and on-premises deployment.”

Analysts said the strategy plays to IBM’s advantage in regulated and sovereign deployments. IDC analyst Jim Mercer said Concert, which focuses on Artificial Intelligence observability by aggregating signals from applications, infrastructure, network and cost into a unified view, competes with AWS DevOps Agent. He said a differentiator is that Concert can work with IBM Sovereign Core and is designed for regulated, hybrid and sovereign environments where data must remain on premises or within jurisdictional boundaries. IBM also integrated Watsonx with Confluent’s streaming data platform after completing its ? billion acquisition of Confluent in March, a move analysts said could improve how agents access business data across complex environments.

Lopez Research founder Maribel Lopez said access to the right business data remains a major obstacle for agentic systems and said the Watsonx.data and Confluent combination should help address part of that challenge. She also pointed to the need to manage large numbers of agents across hybrid environments, arguing that orchestration alone is insufficient without governance spanning multiple cloud platforms. IBM is also facing pressure around COBOL modernization after Anthropic released an agent in February targeting legacy code. Thomas noted that announcement triggered a 13% drop in IBM’s share price, but he argued that the real value lies in understanding the business logic embedded in COBOL, not just rewriting code. Krishna said broader tooling that helps enterprises modernize decades-old COBOL systems would ultimately benefit IBM and its clients.

Impact Score: 52

Genesis mission ties Artificial Intelligence progress to U.S. energy buildout

U.S. Energy Secretary Chris Wright and NVIDIA’s Ian Buck framed energy capacity and computing infrastructure as twin requirements for American leadership in Artificial Intelligence. The Department of Energy’s Genesis Mission is positioned as the practical effort to apply Artificial Intelligence to science, grid modernization, and fusion research.

Artificial Intelligence diffusion lags frontier gains

Rapid advances in Artificial Intelligence capability are not translating automatically into broad productivity growth or equitable gains. Diffusion remains uneven across firms, sectors, countries, and workers, pushing policymakers to focus on skills, governance, procurement, and measurement.

Self-adaptive framework extracts earthquake data from web pages

A self-adaptive large language model framework is designed to extract and structure earthquake information from heterogeneous web sources by generating, validating, and reusing extraction schemas. In controlled tests, GPT_OSS delivered the strongest extraction quality, while selector errors were concentrated in wrong element selection and missing content.

Study finds widespread weaknesses in autonomous agents

A multi-institution study found that autonomous agents across several sectors are highly exposed to tool-chaining, goal drift, and memory poisoning attacks. The findings suggest agentic systems face broader and deeper security risks than stateless large language models.
