‘Thermodynamic computer’ uses thermal noise to mimic generative artificial intelligence

Researchers have demonstrated a thermodynamic computer concept that harnesses thermal noise to generate images, closely mimicking neural network diffusion models while consuming orders of magnitude less energy than conventional artificial intelligence hardware.

Researchers have created a conceptual “generative thermodynamic computer” that produces images from random noise by exploiting thermal fluctuations rather than suppressing them, matching the behavior of generative neural networks while using orders of magnitude less energy. Above absolute zero, atoms, molecules and magnetic moments jiggle constantly in response to thermal noise, but conventional computer chips operate at energies where the cost of flipping a bit dwarfs these fluctuations, rendering the noise effectively irrelevant. Thermodynamic computing instead treats the noise as a computational resource, a choice that aligns naturally with the many modern learning algorithms that already inject noise during neural network training.

The work builds on a broader shift toward probabilistic computing, in which circuits represent probabilities rather than fixed binary values, improving efficiency on optimization problems that seek the best outcome from limited resources. Prior research at Normal Computing Corporation showed that a network of low-energy circuits, coupled through programmable connections at energies comparable to thermal noise, could be tuned so that its equilibrium fluctuations solved linear algebra problems. The coupling strengths encode a question that the system’s equilibrium behavior answers, although this equilibrium-based scheme limits the types of questions that can be posed and can be slow to settle.
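The equilibrium idea can be illustrated with a small numerical sketch. The matrix and vector below are invented for the demo, and the simulation stands in for the hardware's physics: for overdamped Langevin dynamics in the potential U(x) = xᵀAx/2 − bᵀx, the equilibrium mean of the fluctuating state is A⁻¹b, so a long time average of the noisy trajectory reads out the solution of Ax = b.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy "program": a symmetric positive-definite coupling matrix A and
# a bias vector b. The equilibrium mean of the dynamics below is A^{-1} b,
# so time-averaging the fluctuating state solves the linear system A x = b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

dt = 0.01          # integration time step
n_steps = 200_000  # long run so time averages approach equilibrium averages
x = np.zeros(2)
traj = np.empty((n_steps, 2))

for i in range(n_steps):
    # Euler-Maruyama step of overdamped Langevin dynamics: deterministic
    # drift toward the minimum of U(x), plus Gaussian thermal noise.
    x = x + (b - A @ x) * dt + np.sqrt(2.0 * dt) * rng.standard_normal(2)
    traj[i] = x

estimate = traj[n_steps // 2:].mean(axis=0)  # discard the first half as burn-in
exact = np.linalg.solve(A, b)                # exact solution is [0.2, 0.4]
print(estimate, exact)
```

The thermal noise is not averaged away as an error source; it is what drives the system to explore its equilibrium distribution, whose mean encodes the answer.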

Seeking more flexible behavior, Stephen Whitelam at Lawrence Berkeley National Laboratory explored driving such systems out of equilibrium, drawing inspiration from the diffusion models developed in the mid-2010s. Those models take an image, add noise until it disappears, then train a neural network to reverse the process and generate images from pure noise. Using the Langevin equation, introduced by Paul Langevin in 1908 to describe how a system evolves under the combined influence of deterministic forces and random thermal noise, Whitelam calculated the probability of each pixel flipping as an image is corrupted, then derived coupling strengths that reverse the process step by step. Numerical simulations on a tiny training set containing the numerals “0,” “1” and “2” showed that the thermodynamic scheme could retrieve learned digits from noise and even generate new variants absent from the original dataset.
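The flip-probability idea can be sketched with a toy discrete diffusion on binary pixels. Everything here is invented for illustration, not Whitelam's actual scheme: the 3×3 “digit” patterns, the flip rate, and the step count are made up, and pixels are treated as independent (so “training” reduces to per-pixel averages, standing in for learned couplings).

```python
import numpy as np

# Toy dataset: three 3x3 patterns with pixels in {-1, +1}, invented stand-ins
# for the "0", "1", "2" training digits described above.
data = np.array([
    [[-1, 1, -1], [1, -1, 1], [-1, 1, -1]],
    [[1, 1, 1], [-1, -1, -1], [1, 1, 1]],
    [[1, -1, 1], [-1, 1, -1], [1, -1, 1]],
], dtype=float)

eps = 0.05  # forward process: each pixel flips with probability eps per step
T = 100     # after T steps an image is essentially pure noise

def a(t):
    # "Signal" surviving t forward steps: P(x_t = x_0) = (1 + a(t)) / 2.
    return (1.0 - 2.0 * eps) ** t

# "Training": with independent pixels, the model only needs each pixel's mean
# over the dataset (a stand-in for the learned coupling strengths).
m = data.mean(axis=0)

def generate(rng):
    """Run the reverse (denoising) process from pure noise to a sample."""
    x = rng.choice([-1.0, 1.0], size=(3, 3))
    for t in range(T, 0, -1):
        # Posterior over the clean pixel given the current noisy one:
        # P(x0 = s | x_t) is proportional to (1 + s*m) * (1 + a(t)*s*x_t).
        w_plus = (1 + m) * (1 + a(t) * x)
        w_minus = (1 - m) * (1 - a(t) * x)
        p0 = w_plus / (w_plus + w_minus)

        # One exact reverse step, conditioned on a clean-pixel value x0:
        # P(x_{t-1} = +1 | x_t, x0) prop. to (1 + a(1)*x_t) * (1 + a(t-1)*x0).
        def rev(x0):
            up = (1 + a(1) * x) * (1 + a(t - 1) * x0)
            dn = (1 - a(1) * x) * (1 - a(t - 1) * x0)
            return up / (up + dn)

        # Marginalise over the clean pixel, then sample the next state.
        p_plus = p0 * rev(1.0) + (1 - p0) * rev(-1.0)
        x = np.where(rng.random((3, 3)) < p_plus, 1.0, -1.0)
    return x

print(generate(np.random.default_rng(1)))
```

Because the reverse flip probabilities are derived exactly from the forward corruption process, samples drawn this way reproduce the learned pixel statistics, and, since each run starts from fresh noise, they can yield variants that never appeared in the dataset.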

Industry observers see the approach as an important addition to the growing family of probabilistic computing paradigms addressing the rising energy demands of conventional artificial intelligence. Ramy Shelbaya, CEO of Quantum Dice, which develops quantum-based probabilistic hardware, highlighted both the potential energy savings and the way physics-based models can make learning processes more transparent than traditional “black-box” systems. Although generating three digits from noise is modest compared with state-of-the-art generative artificial intelligence, thermodynamic computing as a concept is only a few years old, and Whitelam is now focused on whether such hardware can be scaled up as rapidly as machine learning itself has grown.
