OpenAI debuts GPT-5.4 with native computer control

OpenAI’s GPT-5.4 introduces native computer control to move beyond chat, while Lightricks’ LTX-2.3 brings local AI video generation and Anthropic rolls out a job impact tracker.

OpenAI released GPT-5.4, a model positioned to move AI beyond text chat by adding native computer control and stronger reasoning. The system can issue keyboard and mouse commands, interpret application state from screenshots, and perform multi-step tasks across a user’s device, turning it into a digital operator rather than a passive assistant. OpenAI highlights improved reasoning that helps GPT-5.4 search across multiple sources and synthesize answers more reliably, along with enhanced coding and professional productivity skills aimed at real work scenarios.
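The screenshot-to-action loop described above can be sketched roughly as follows. This is an illustrative Python mock, not OpenAI’s actual API: `plan_next_action`, the `Action` type, and the placeholder screenshot are all hypothetical stand-ins for the model call and screen capture a real agent would use.

```python
from dataclasses import dataclass

@dataclass
class Action:
    kind: str      # e.g. "click", "type", or "done"
    payload: tuple # coordinates or text for the action

def plan_next_action(screenshot: bytes, goal: str, history: list) -> Action:
    # Hypothetical stand-in for the model call: a real agent would send
    # the screenshot and goal to the model and parse its chosen action.
    if not history:
        return Action("click", (120, 340))
    return Action("done", ())

def run_agent(goal: str, max_steps: int = 10) -> list:
    # Observe-plan-act loop: capture the screen, ask the model for the
    # next action, execute it, and repeat until the model signals "done".
    history = []
    for _ in range(max_steps):
        screenshot = b"<pixels>"  # placeholder for a real screen capture
        action = plan_next_action(screenshot, goal, history)
        history.append(action)
        if action.kind == "done":
            break
    return history
```

The `max_steps` cap is a common safety choice in such loops, bounding how long the agent can act without finishing its task.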

The launch emphasizes factual accuracy and transparency. GPT-5.4 delivers 33% fewer false claims than GPT-5.2, making it OpenAI’s most factual model to date, and introduces a GPT-5.4 Thinking mode in ChatGPT that outlines its reasoning as it responds. Users can adjust their requests mid-response, potentially tightening control over how the model interprets and executes tasks. By pairing these reasoning upgrades with native computer use, GPT-5.4 is framed as a significant step toward autonomous AI agents that can reliably operate software and complete workflows on users’ behalf.

Alongside OpenAI’s update, Lightricks introduced LTX-2.3, a production-grade AI video model with built-in audio generation that runs entirely on consumer hardware, including MacBooks and NVIDIA RTX 30/40/50 GPUs with as little as 8 GB of VRAM. LTX-2.3 generates up to 4K video at 50 FPS, supports clips up to 20 seconds, and offers native portrait formats for TikTok and Reels, with two modes, Fast and Pro, for iteration and final renders. The model ships with fully open weights and free commercial use for companies under $10M in revenue, giving creators local control over footage, eliminating ongoing cloud compute fees, and enabling studios to fine-tune on proprietary content.

On the labor side, Anthropic launched an AI Job Impact Tracker that measures how automation is affecting different professions by combining each occupation’s task automation potential with observed large language model usage. It finds computer programming (75%), customer service (70%), and data entry (67%) highly exposed, while many physical and service roles show near-zero exposure. Anthropic also notes that early data shows no clear rise in unemployment yet, even as some entry-level hiring may be slowing.
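The tracker’s combination of automation potential and observed usage can be illustrated with a minimal sketch. This is not Anthropic’s published methodology; the weighted blend, the weight `w`, and the sample inputs are assumptions for illustration only.

```python
def exposure(automation_potential: float, observed_usage: float, w: float = 0.5) -> float:
    """Blend two signals into a single exposure score in [0, 1].

    automation_potential: share of the occupation's tasks that could be
        automated (hypothetical input).
    observed_usage: how much real LLM usage is seen for those tasks
        (hypothetical input).
    w: weight on automation potential (an assumed equal split by default).
    """
    return w * automation_potential + (1 - w) * observed_usage

# Illustrative only: these inputs are invented, not the tracker's data.
score = exposure(0.8, 0.7)  # 0.5 * 0.8 + 0.5 * 0.7 = 0.75
```

A blend like this keeps a purely theoretical automation estimate from dominating: an occupation scores high only when its tasks are both automatable in principle and actually being handed to models in practice.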
