Adobe launches LLM Optimizer to boost brand visibility in generative Artificial Intelligence era

Adobe debuts LLM Optimizer, empowering enterprises to track and improve brand engagement as Artificial Intelligence-powered browsing and chat become mainstream.

Adobe has introduced LLM Optimizer, a new enterprise application aimed at helping businesses adapt to the rapidly changing digital landscape as consumers increasingly interact with brands through generative Artificial Intelligence-powered browsers and chat services. Announced at Cannes Lions 2025, the tool integrates with Adobe Experience Cloud and promises a robust set of features, including real-time monitoring of Artificial Intelligence-driven traffic, benchmarking against peer brands, and actionable, quick-to-implement recommendations to optimize discoverability and customer engagement.

LLM Optimizer provides organizations with critical visibility into how their owned content is surfaced by large language models in user queries, enabling teams to see exactly where and how their brand appears in Artificial Intelligence-fueled experiences. The platform also includes benchmarking, allowing businesses to compare their prominence in high-value queries versus competitors and adjust their strategy accordingly. The integrated recommendation engine detects visibility gaps and suggests improvements across both proprietary and external content channels, such as web pages, FAQs, Wikipedia, and popular forums. Recommendations are tailored to align with the ranking criteria prioritized by large language models, emphasizing informative, high-quality sources.

Adobe's announcement is underpinned by striking new data: according to Adobe Analytics, U.S. retail sites experienced a staggering 3,500% increase in traffic from generative Artificial Intelligence sources between July 2024 and May 2025. Travel sites saw a similar surge, with a 3,200% uptick. This shift highlights a dramatic evolution in how users discover and engage with brands, with conversational experiences at the forefront. LLM Optimizer is designed to fit seamlessly into existing workflows for SEO teams, content strategists, digital marketers, and web publishers, while also integrating with enterprise frameworks such as Agent-to-Agent and Model Context Protocol. Businesses can deploy the tool as a standalone solution or leverage its native integration with Adobe Experience Manager Sites. With LLM Optimizer, Adobe aims to enable organizations not only to track but to actively shape their presence in an Artificial Intelligence-driven digital economy, ensuring they capture attention and drive results as generative experiences proliferate.

LLM-PIEval: a benchmark for indirect prompt injection attacks in large language models

The rise of large language models has heightened interest in Artificial Intelligence, and their integration with external tools introduces risks such as direct and indirect prompt injection. LLM-PIEval provides a framework and test set for measuring indirect prompt injection risk, and the authors release API specifications and prompts to support wider assessment.

NVIDIA may stop bundling memory with GPU kits amid GDDR shortage

NVIDIA is reportedly considering supplying only bare silicon to its AIC partners, rather than the usual kit of GPU plus memory, as GDDR shortages constrain fulfillment. The move follows wider industry pressure from soaring DRAM prices and an impending price increase of about 10% from AMD across its GPU lineup.

SK Hynix to showcase 48 Gb/s 24 Gb GDDR7 for Artificial Intelligence inference

SK Hynix will present a 24 Gb GDDR7 chip rated at 48 Gb/s at ISSCC 2026, claiming a symmetric dual-channel design and updated internal interfaces that push past the previously expected 32 to 37 Gb/s range. The paper positions the device for mid-range Artificial Intelligence inference, and SK Hynix will also show LPDDR6 running at 14.4 Gb/s.
