Researchers build low-power artificial intelligence that learns like the human brain

A research team has developed a brain-inspired artificial intelligence system that learns continuously, reacts only to meaningful changes, and runs on specialized low-power chips, bringing advanced machine learning to edge devices. The approach challenges energy-hungry data center models and points toward a more ecological, widely accessible future for artificial intelligence.

Researchers from a cross-disciplinary group of neuroscientists, chip designers, and machine learning engineers have created an artificial intelligence system that mimics key principles of the human brain to dramatically cut power use. Instead of relying on massive data centers and giant datasets, the system learns continuously from real-world events, updating itself in real time while running on a small board that uses less power than a laptop. The work is motivated by a stark contrast: current artificial intelligence models can consume as much electricity during training as a small town uses in weeks or even months, while the human brain handles complex perception, memory, and prediction on about 20 watts of power, roughly the same as a dim light bulb.

The team’s breakthrough centers on sparsity and event-driven computation. In the human brain, most neurons stay quiet most of the time, and activity is localized and brief, like a city at night with only a few windows lit. Conventional artificial intelligence activates large portions of a network for every task, which is fast but energy-intensive. The new models activate only small, relevant parts at any moment and pair this with sensors and algorithms that respond mainly to changes, much as the eye sends motion and edge information rather than full images dozens of times per second. In tests such as gesture recognition from video, the standard model processed every frame and produced a power profile like a jagged mountain range, while the brain-inspired system produced a nearly flat power line with tiny spikes only when a hand moved, yet achieved similar accuracy for a fraction of the energy.
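
The article does not publish the team's code, but the change-driven gating it describes can be illustrated with a minimal sketch. The frame-differencing rule, the CHANGE_THRESHOLD value, and the classify_gesture stub below are assumptions introduced for illustration, not the researchers' actual pipeline.

```python
import numpy as np

# Hedged sketch of change-driven ("event-driven") processing: the expensive
# model runs only when enough pixels change between frames, instead of on
# every frame. Threshold and classifier are placeholder assumptions.

CHANGE_THRESHOLD = 0.02  # fraction of pixels that must change to trigger the model

def classify_gesture(frame):
    """Placeholder for an expensive gesture classifier (the costly step)."""
    return "gesture"

def process_stream(frames):
    prev = None
    results = []
    for frame in frames:
        if prev is None:
            changed = 1.0  # always process the first frame
        else:
            diff = np.abs(frame.astype(float) - prev.astype(float))
            changed = float(np.mean(diff > 10))  # fraction of pixels that moved
        if changed > CHANGE_THRESHOLD:
            results.append(classify_gesture(frame))  # rare, expensive "spike"
        else:
            results.append(None)  # quiet frame: near-zero compute
        prev = frame
    return results
```

Run on a mostly static scene, nearly every iteration skips the classifier, which is the behavior behind the nearly flat power line with occasional spikes that the researchers describe.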

Delivering this behavior required reworking both learning algorithms and hardware. The system moves away from exclusive dependence on backpropagation over giant batches and instead learns incrementally, updating connections after each relevant event, which suits devices that live at the network edge. It emphasizes local learning, in which nearby units adjust based on each other's activity, and it incorporates mechanisms to forget gracefully, akin to a form of artificial sleep that prioritizes and compresses memories. On the hardware side, the design places memory next to processing elements in neuromorphic chips, reducing the energy wasted shuttling data back and forth. A comparison table highlights the contrast: typical artificial intelligence today is trained in large batches on huge datasets, with dense activity and high energy use in cloud servers, while the brain-inspired low-power system learns continuously from streaming events, with sparse activation, on small, efficient chips at the edge.
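
As a rough illustration of that learning style, the sketch below applies a local, Oja-style Hebbian update after every incoming event and slowly decays connections as a stand-in for graceful forgetting. The specific rule, learning rate, and decay factor are illustrative assumptions, not the team's published algorithm.

```python
import numpy as np

# Minimal sketch of local, event-by-event learning (assumed Hebbian/Oja-style
# rule): each event updates weights using only the activity it produces,
# with a slow decay standing in for "graceful forgetting".

rng = np.random.default_rng(0)
n_inputs, n_outputs = 16, 4
weights = rng.normal(scale=0.1, size=(n_outputs, n_inputs))

LEARNING_RATE = 0.05  # illustrative value
DECAY = 0.999         # slow fade of unused connections

def process_event(x):
    """Update weights from a single event: no batches, no backpropagation."""
    global weights
    y = np.maximum(weights @ x, 0.0)  # local activity of the output units
    # Oja-style local rule: co-active units strengthen, with built-in normalization
    weights += LEARNING_RATE * (np.outer(y, x) - (y ** 2)[:, None] * weights)
    weights *= DECAY  # graceful forgetting
    return y

# Streaming use: learn from events one at a time as they arrive.
for _ in range(1000):
    event = (rng.random(n_inputs) < 0.1).astype(float)  # mostly quiet input
    process_event(event)
```

Each update touches only the weights involved in the current event, which is what makes this style of learning a natural fit for memory placed next to the processing elements.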

The implications extend beyond engineering efficiency to questions of access, climate, and everyday experience. When each new model costs millions to train and demands industrial-scale hardware, advances concentrate in large companies and wealthy institutions, and many regions become consumers rather than creators. Low-power, brain-like artificial intelligence makes it plausible for solar-powered sensors in remote fields, battery-powered devices in villages off the grid, and localized tools in clinics or farms to host meaningful intelligence, such as a weather station that learns local wind patterns or a water pump that predicts its own failure. As artificial intelligence spreads, its total energy use and associated emissions pose a growing trade-off, and every reduction in its power appetite becomes a way to align digital systems with the efficiency of living ecosystems rather than the brute force of factories.

Envisioned applications include traffic cameras that wake only when movement matters instead of streaming every second of video, crosswalks that learn local patterns such as school routes while running on small solar panels, vineyards where sensors learn the interplay of fog and temperature to fine-tune irrigation, and forest listening posts that wait quietly for anomalies such as early fire sounds or illegal chainsaws. In homes, such artificial intelligence could distinguish between similar-sounding voice commands, or between a cough and a dropped plate, directly on the device without sending audio to distant servers, preserving privacy and barely shifting the energy bill. The research resonates with a broader idea that intelligence grows out of constraints, as brains evolved to conserve scarce calories and attend only when necessary, and it suggests that by copying nature's frugality, machines can become less intrusive and more compatible with the rest of life.
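
To make the on-device behavior concrete, here is a hedged sketch of how a home device might gate a local sound classifier on a simple loudness change so that raw audio never leaves it. The WAKE_FACTOR value and the classify_sound stub are invented for illustration and do not come from the article.

```python
import numpy as np

# Hypothetical on-device pattern: audio stays local, and the classifier only
# wakes when short-term loudness departs from a running background estimate.

WAKE_FACTOR = 3.0  # illustrative: wake when energy is 3x the background level

def classify_sound(chunk):
    """Placeholder for a small on-device classifier (e.g. cough vs. dropped plate)."""
    return "unknown"

def listen(chunks):
    background = 1e-6  # running estimate of ambient loudness
    for chunk in chunks:
        energy = float(np.mean(chunk ** 2))
        if energy > WAKE_FACTOR * background:
            yield classify_sound(chunk)  # only a label ever leaves this step
        background = 0.95 * background + 0.05 * energy  # slow background tracking
```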

Researchers acknowledge that this advance will not replace large data center models, which still excel at tasks requiring broad knowledge and heavy computation, and that current brain-inspired systems face trade-offs in precision, complexity, and flexibility. Yet the trajectory has shifted toward asking how much can be done with less, rather than assuming that more data and more power are always the answer. In the near future, the impact may appear in devices that last longer on a charge, tools that keep adapting without a Wi-Fi signal, and sensors that operate for months or years on small amounts of energy harvested from sunlight, vibration, or heat. Behind such objects will be descendants of the quiet board in the dim lab, using timing, selectivity, and structure to learn in real time without burning through the world that sustains them, and future teams will watch not only for accuracy but for the softness of the energy curve that marks a truly frugal intelligence.
