Researchers at Tufts University have developed a proof-of-concept artificial intelligence (AI) system designed to cut the heavy power demands of current data center and model-training approaches. The work focuses on neuro-symbolic AI, which combines conventional neural network pattern recognition with symbolic reasoning that breaks tasks into rules, categories, and structured steps. The team argues that this hybrid approach can improve both efficiency and reliability in robotics applications.
The research centers on visual-language-action (VLA) models, which are used to control robots rather than to power screen-based large language models. These systems take camera and language inputs and translate them into real-world actions such as moving wheels, arms, legs, or fingers. Standard VLA systems rely on statistical patterns learned from large training sets, which can lead to mistakes when interpreting physical settings. In a task like stacking blocks, errors can include misreading shapes because of shadows, placing blocks incorrectly, or building unstable structures. Symbolic reasoning is intended to reduce those failures by applying generalized rules about objects, task structure, and physical constraints.
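The idea of applying generalized rules can be made concrete with a small sketch. The rules and names below are purely illustrative, not taken from the Tufts system: a symbolic check might encode the physical constraint that a block can only rest on an equal-or-larger block, and enumerate only the moves that keep every stack stable.

```python
# Illustrative sketch of symbolic constraint checking for block stacking.
# The stability rule and move generator are hypothetical examples of the
# kind of rule-based filtering symbolic reasoning performs; they are not
# the actual rules used in the Tufts research.

def is_stable(stack):
    """A stack is stable if each block rests on an equal-or-larger block."""
    return all(lower >= upper for lower, upper in zip(stack, stack[1:]))

def legal_moves(stacks):
    """Enumerate (source, destination) moves that keep every stack stable."""
    moves = []
    for i, src in enumerate(stacks):
        if not src:
            continue  # nothing to pick up from an empty stack
        block = src[-1]  # only the top block can be moved
        for j, dst in enumerate(stacks):
            if i != j and (not dst or dst[-1] >= block):
                moves.append((i, j))
    return moves

# Three stacks: [3, 2] (2 on top of 3), [1], and an empty spot.
print(legal_moves([[3, 2], [1], []]))  # → [(0, 2), (1, 0), (1, 2)]
```

A rule like `is_stable` generalizes to any block sizes without retraining, which is the contrast the researchers draw with pattern-matching over training examples.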
In tests on a standard Tower of Hanoi puzzle, the neuro-symbolic VLA system achieved a 95% success rate, compared with 34% for standard VLAs. On a more complex version of the puzzle that the robot had not seen in training, the neuro-symbolic system succeeded 78% of the time, while standard VLAs failed every attempt. The neuro-symbolic system could be trained in just 34 minutes, whereas the standard VLA model took more than a day and a half. Significantly, training the neuro-symbolic model used only 1% of the energy required to train the VLA model, and the savings continued at run time: executing tasks with the neuro-symbolic model took only 5% of the energy required to run the VLA.
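Tower of Hanoi is a natural benchmark for this kind of system because it has a compact rule-based solution. The classic recursive solver below illustrates the structured task decomposition that symbolic reasoning performs; it is a textbook algorithm, not the planner used in the Tufts work.

```python
# Classic recursive Tower of Hanoi solver. The recursion decomposes the
# task into rules and subtasks, the style of structured reasoning that
# neuro-symbolic systems add on top of neural pattern recognition.

def hanoi(n, src, aux, dst):
    """Return the list of (from_peg, to_peg) moves transferring n disks."""
    if n == 0:
        return []
    # Rule: move n-1 disks aside, move the largest disk, then restack.
    return (hanoi(n - 1, src, dst, aux)
            + [(src, dst)]
            + hanoi(n - 1, aux, src, dst))

moves = hanoi(3, "A", "B", "C")
print(len(moves))  # → 7, i.e. 2**3 - 1
```

Because the same rule solves the puzzle for any number of disks, a system that reasons this way can handle puzzle variants it never saw in training, consistent with the 78% success rate reported on the unfamiliar version.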
The findings come as power use from AI infrastructure continues to rise. The International Energy Agency estimates that U.S. AI and data centers used about 415 terawatt-hours of power in 2024, more than 10% of that year's nationwide energy output, and that figure is expected to double by 2030. Matthias Scheutz of Tufts said current predictive systems can be both error-prone and energy-intensive, noting that an AI summary on Google can consume up to 100 times more energy than generating standard website listings. The researchers conclude that hybrid neuro-symbolic AI may offer a more sustainable and dependable foundation than current large language models and VLA systems.
