Neuro-symbolic models cut energy use and improve robot task performance

Researchers at Tufts University report that neuro-symbolic Artificial Intelligence systems could sharply reduce power consumption while improving accuracy in robot control tasks. The approach combines neural network learning with symbolic reasoning to limit trial and error and improve reliability.

The team's proof-of-concept system is designed to reduce the heavy power demands of current data center and model training approaches. The work focuses on neuro-symbolic Artificial Intelligence, which combines conventional neural network pattern recognition with symbolic reasoning that breaks tasks into rules, categories, and structured steps. The researchers argue that this hybrid approach can improve both efficiency and reliability for robotics applications.

The research centers on vision-language-action (VLA) models, which control robots rather than producing on-screen text like large language models. These systems take camera and language inputs and translate them into real-world actions such as moving wheels, arms, legs, or fingers. Standard VLA systems rely on statistical patterns learned from large training sets, which can lead to mistakes when interpreting physical settings. In tasks like stacking blocks, errors can include misreading shapes because of shadows, placing blocks incorrectly, or building unstable structures. Symbolic reasoning is intended to reduce those failures by applying generalized rules about objects, task structure, and physical constraints.

In tests using a standard Tower of Hanoi puzzle, the neuro-symbolic VLA system achieved a 95% success rate, compared with 34% for standard VLAs. On a more complex version of the puzzle that the robot had not seen in training, the neuro-symbolic system succeeded 78% of the time, while standard VLAs failed every attempt. The neuro-symbolic system could be trained in just 34 minutes, while the standard VLA model took over a day and a half. Significantly, training the neuro-symbolic model used only 1% of the energy required to train the VLA model, and the savings continued at runtime: executing tasks with the neuro-symbolic system required only 5% of the energy used by the VLA.
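The researchers' planner is not published here, but the contrast they draw is between learned statistical patterns and explicit task rules. As a rough, hypothetical sketch of what symbolic task decomposition looks like on this benchmark, the following Python snippet solves Tower of Hanoi from the classic recurrence and then checks the plan against the puzzle's physical constraint, so every move is guaranteed by rule rather than predicted from training data:

```python
# Illustration of symbolic task decomposition (not the Tufts system):
# Tower of Hanoi expressed as explicit rules, so correctness follows
# from task structure rather than from learned statistical patterns.

def hanoi_moves(n, source, target, spare):
    """Move sequence for n disks, from the rule: move n-1 disks aside,
    move the largest disk, then move the n-1 disks back on top."""
    if n == 0:
        return []
    return (hanoi_moves(n - 1, source, spare, target)
            + [(source, target)]
            + hanoi_moves(n - 1, spare, target, source))

def is_legal_plan(n, moves):
    """Check the symbolic constraint: never place a larger disk on a
    smaller one, and require all disks to end up on peg C."""
    pegs = {"A": list(range(n, 0, -1)), "B": [], "C": []}
    for src, dst in moves:
        if not pegs[src]:
            return False                      # moving from an empty peg
        disk = pegs[src].pop()
        if pegs[dst] and pegs[dst][-1] < disk:
            return False                      # larger disk onto smaller
        pegs[dst].append(disk)
    return pegs["C"] == list(range(n, 0, -1))

plan = hanoi_moves(3, "A", "C", "B")
print(len(plan))               # 7 moves (2**3 - 1)
print(is_legal_plan(3, plan))  # True
```

Because the plan is derived from the task's structure, it generalizes to any disk count unseen in "training", which is the kind of transfer the 78% result on the unfamiliar puzzle variant points at.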

The findings come as power use from Artificial Intelligence infrastructure continues to rise. The International Energy Agency estimates that Artificial Intelligence and data centers in the U.S. used about 415 terawatt hours of power in 2024, more than 10% of that year's nationwide energy output, and that figure is expected to double by 2030. Tufts professor Matthias Scheutz said current predictive systems can be both error-prone and energy-intensive, noting that an Artificial Intelligence summary on Google can consume up to 100 times more energy than generating standard website listings. The researchers conclude that hybrid neuro-symbolic Artificial Intelligence may offer a more sustainable and dependable foundation than current large language models and vision-language-action systems.


Intel bets on GPUs to extend its turnaround

Intel is trying to build on early signs of recovery by pushing deeper into GPUs for Artificial Intelligence data centers and consumer devices. The strategy hinges on sustained demand, sharper execution, and a broader effort to restore confidence in the company.

Sony signals frame generation for PlayStation

Sony indicates that frame generation is planned for PlayStation platforms, extending the company’s push into machine learning and Artificial Intelligence-based graphics features. The timing remains unclear, and no release is planned for 2026.

Sports attention economy projected to reach $600 billion by 2030

Athletes are becoming powerful direct-to-fan media businesses as capital shifts toward the infrastructure that captures audience data, engagement, and commerce. Investors are increasingly backing fan platforms, athlete brand tools, fintech, and Artificial Intelligence systems rather than traditional league-controlled channels.
