The state of Artificial Intelligence in learning systems

Artificial Intelligence remains one of the most heavily promoted trends in learning systems, but practical deployment is still uneven. The strongest traction is in content creation, translation, and localisation, while broader adoption is constrained by caution, weak differentiation, and changing market boundaries.

Published in February 2026, the Fosway 9-Grid™ for Learning Systems examines the learning management solutions market for EMEA's learning leaders, with a close focus on the state of Artificial Intelligence in learning systems. Many suppliers market themselves as native Artificial Intelligence platforms that can transform learning, yet there is still a wide gap between promotional claims, live customer deployments, and the features customers actually use. In some regions and industries, the most requested Artificial Intelligence feature is the ability to switch Artificial Intelligence off, highlighting continued caution among corporate buyers.

The most common Generative Artificial Intelligence use cases centre on content creation, translation, and localisation. L&D teams are finding these capabilities genuinely useful, particularly for reducing repetitive tasks and speeding up course development. Other uses can be harder to sell even when the potential value is high. Using Artificial Intelligence to cut days of learning administration work down to minutes can create anxiety among learning staff concerned about job displacement. User experience is another barrier: most Artificial Intelligence assistants look and behave similarly, with limited differentiation and often uninspiring interfaces. Although the technology is maturing quickly, the gap between promise and practical, day-to-day usage remains substantial. Vendors and buyers need to focus on meaningful use cases and better usability if Artificial Intelligence is to deliver real value.

The market is also being reshaped by convergence between learning systems, authoring tools, and content. Learning system vendors are expanding beyond core platform functions, with content creation and provision becoming a strategic priority. Authoring is now widely treated as a baseline expectation, and vendors are adding these capabilities either through acquisition or by building them directly into their suites. New buyers increasingly expect to populate a learning system quickly and cheaply, including the ability to create and adapt their own content rather than only distribute it.

That shift is putting pressure on adjacent providers. Content library companies are strengthening their own system capabilities and repositioning themselves as platform providers instead of pure content businesses. The long-term defensibility of a standalone content catalogue is weakening, especially as Artificial Intelligence speeds up content generation and personalisation. Some learning system vendors are also using Artificial Intelligence to build proprietary content libraries quickly, allowing them to undercut traditional content providers and reduce dependence on third-party catalogues. As a result, the boundaries between platforms, authoring, and content have become increasingly blurred.

AMD plans specialized EPYC CPUs for Artificial Intelligence, HPC, and cloud

AMD is preparing a broader EPYC strategy with task-specific server CPUs aimed at agentic Artificial Intelligence, HPC, training and inference, and cloud deployments. The shift starts with the Zen 6 generation and adds Verano as an Artificial Intelligence-focused variant within the same EPYC family.

Nvidia expands Spectrum-X Ethernet with open MRC protocol

Nvidia is positioning Spectrum-X Ethernet as a foundation for large-scale Artificial Intelligence training, with Multipath Reliable Connection (MRC) adding open, multi-path RDMA transport for higher resilience and throughput. OpenAI, Microsoft, and Oracle are among the organizations using the technology in large Artificial Intelligence environments.

Anthropic explores Fractile chips to diversify supply

Anthropic is reportedly in early talks with London-based Fractile to secure high-performance Artificial Intelligence chips for inference workloads. The move would reduce reliance on Nvidia and broaden the company’s hardware supply chain.

OpenAI curbs odd creature references in chatbot responses

OpenAI has adjusted its models after users complained about overly familiar responses and strange references to goblins, gremlins, pigeons, and raccoons. The company traced the behavior to a retired “nerdy” personality whose habits spread into broader model training.
