How scientists are using artificial intelligence to probe the human mind

Researchers are leveraging artificial intelligence to model and predict human cognition, shedding light on both the possibilities and limitations of brain-inspired neural networks.

Despite the common comparison between artificial neural networks and biological brains, today’s approaches to artificial intelligence involve energy-hungry models that consume enormous resources, in stark contrast to the efficiency of human learning. Both systems—human brains with their billions of neurons and large language models with simulated ones—remain enigmatic, particularly in their natural facility for producing language. Scientists see promise in pushing artificial intelligence to not just mimic but unravel the mysteries of human cognition.

Recent studies published in Nature highlight efforts to bridge psychology and artificial intelligence. At the forefront is Centaur, a large language model fine-tuned on data from 160 psychology experiments. Unlike traditional models relying on basic equations, Centaur achieved significant improvements in predicting human behavior across tasks like memory recall and decision-making. Beyond prediction, researchers hope that unpacking how Centaur arrives at its choices will reveal mechanisms of thought, potentially guiding the development of new psychological theories. However, some cognitive scientists urge caution, noting that good behavioral mimicry doesn’t guarantee cognitive similarity; complex outputs don’t always reflect similar underlying processes—much like comparing a calculator to a human solving math problems.
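The "basic equations" the article alludes to are classic cognitive models, such as a learning rule paired with a probabilistic choice rule, fitted to each participant's behavior. The sketch below is purely illustrative (not code from the study): a delta-rule learner with a softmax choice rule for a two-option bandit task, scored by the probability it assigns to the choices a participant actually made. The learning rate and temperature values are placeholder assumptions, not fitted parameters.

```python
# Minimal sketch (not from the study): the kind of "basic equation" cognitive
# model that Centaur is contrasted with. A delta-rule learner with a softmax
# choice rule predicts a person's choices in a two-option bandit task, and the
# model is scored by how much probability it assigns to the observed choices.
import numpy as np

def predict_choices(choices, rewards, alpha=0.3, beta=3.0):
    """Return per-trial probabilities the model assigns to the observed choices.

    choices: array of 0/1 option picks made by the participant
    rewards: array of payoffs received on each trial
    alpha:   learning rate (assumed value; normally fitted per participant)
    beta:    softmax inverse temperature (assumed value)
    """
    q = np.zeros(2)                       # learned value estimates for the two options
    probs = np.empty(len(choices))
    for t, (c, r) in enumerate(zip(choices, rewards)):
        p = np.exp(beta * q) / np.exp(beta * q).sum()   # softmax choice rule
        probs[t] = p[c]                    # probability assigned to the actual choice
        q[c] += alpha * (r - q[c])         # delta-rule update toward the received reward
    return probs

# Example with a short synthetic session. A higher mean log-likelihood means
# better prediction, the same yardstick used when comparing such models with Centaur.
choices = np.array([0, 0, 1, 1, 1, 0])
rewards = np.array([1, 0, 1, 1, 0, 0])
print("mean log-likelihood:", np.log(predict_choices(choices, rewards)).mean())
```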

To address the challenge of interpretability, other researchers are experimenting with much smaller neural networks. These micro-models, sometimes containing only a single artificial neuron, show remarkable predictive accuracy in behavioral experiments with animals and humans. Their small size allows for transparency: scientists can track each neuron’s activity and draw clearer connections between model structure and output. Still, each of these networks is limited to narrowly defined tasks, unlike the broader versatility of Centaur. The field faces a fundamental trade-off: larger models excel at prediction but elude understanding, while smaller models offer insight yet handle only simple problems. As artificial intelligence advances, the pace of predictive breakthroughs continues to outstrip our ability to interpret them, highlighting the enduring gulf between knowing what will happen and understanding why.
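To make the transparency claim concrete, here is a hypothetical sketch of a single-neuron "micro-model": one logistic unit trained to predict a binary choice from two task features. The task, features, and data are invented for illustration; the point is that the entire model is a couple of weights and a bias, all of which can be read off directly.

```python
# Illustrative sketch (not from the papers): a "micro-model" consisting of a single
# artificial neuron. One weight per input feature plus a bias; every parameter is
# directly inspectable, which is what makes such models easy to interpret.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical task: predict whether a subject picks the risky option from two
# features (e.g., expected payoff difference and recent reward). Synthetic data.
X = rng.normal(size=(500, 2))
y = (rng.random(500) < sigmoid(1.5 * X[:, 0] - 0.8 * X[:, 1])).astype(float)

# Train the single neuron by gradient descent on the logistic loss.
w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(2000):
    p = sigmoid(X @ w + b)
    w -= lr * (X.T @ (p - y) / len(y))
    b -= lr * (p - y).mean()

# The whole model is two weights and a bias, so its "reasoning" is transparent.
print("weights:", w, "bias:", b)
print("accuracy:", ((sigmoid(X @ w + b) > 0.5) == y).mean())
```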


Samsung completes HBM4 development, awaits NVIDIA approval

Samsung says it has cleared Production Readiness Approval for its first sixth-generation HBM (HBM4) and has shipped samples to NVIDIA for evaluation. Initial samples have exceeded NVIDIA’s next-generation GPU requirement of 11 Gbps per pin, and HBM4 promises roughly 60% higher bandwidth than HBM3E.
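For context, per-stack bandwidth follows directly from the per-pin data rate and the interface width. The back-of-the-envelope sketch below assumes HBM4's 2048-bit per-stack interface (per the published JEDEC standard); at the 11 Gbps sample speed quoted above, that works out to roughly 2.8 TB/s per stack.

```python
# Back-of-the-envelope sketch: converting a per-pin data rate into per-stack bandwidth.
# Assumes a 2048-bit interface per HBM4 stack (published JEDEC spec); the 11 Gbps
# figure is the sample speed quoted in the brief above.
def stack_bandwidth_gb_s(pin_rate_gbps, bus_width_bits=2048):
    """Peak per-stack bandwidth in GB/s = pin rate (Gb/s) * bus width (bits) / 8."""
    return pin_rate_gbps * bus_width_bits / 8

print(stack_bandwidth_gb_s(11.0))   # ~2816 GB/s, i.e. roughly 2.8 TB/s per stack
```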

NVIDIA and AWS expand full-stack partnership for AI compute platforms

NVIDIA and AWS expanded their integration around AI infrastructure at AWS re:Invent, announcing support for NVIDIA NVLink Fusion with Trainium4, Graviton and the Nitro System. The move aims to unify NVIDIA’s scale-up interconnect and MGX rack architecture with AWS custom silicon to speed cloud-scale AI deployments.
