A new discovery of how bees use their flight movements to learn and recognise complex visual patterns with remarkable accuracy could mark a major change in how next-generation AI is developed, according to a University of Sheffield study.
By building a computational model - a digital version of a bee's brain - researchers have discovered how the movements bees make during flight shape their visual input and generate distinctive neural signals. These signals allow bees to identify predictable features of the world around them easily and efficiently, giving them remarkable accuracy in learning and recognising complex visual patterns in flight, such as those found in a flower.
The model not only deepens our understanding of how bees learn and recognise complex patterns through their movements, but also paves the way for next-generation AI. It demonstrates that future robots can be smarter and more efficient by using movement to gather information, rather than relying on massive computing power.
Professor James Marshall, Director of the Centre of Machine Intelligence at the University of Sheffield and senior author on the study, said: "In this study we've successfully demonstrated that even the tiniest of brains can leverage movement to perceive and understand the world around them. This shows us that a small, efficient system - albeit the result of millions of years of evolution - can perform computations vastly more complex than we previously thought possible.
"Harnessing nature's best designs for intelligence opens the door for the next generation of AI, driving advancements in robotics, self-driving vehicles and real-world learning."
The study, a collaboration with Queen Mary University of London, is published today in the journal eLife. It builds on the team's previous research into how bees use active vision - the process where their movements help them collect and process visual information. While their earlier work observed how bees fly around and inspect specific patterns, this new study provides a deeper understanding of the underlying brain mechanisms driving that behaviour.
The sophisticated visual pattern learning abilities of bees, such as differentiating between human faces, have long been understood; however, the study's findings shed new light on how pollinators navigate the world with such seemingly simple efficiency.
Dr. HaDi MaBouDi, lead author and researcher at the University of Sheffield, said: "In our previous work, we were fascinated to discover that bees employ a clever scanning shortcut to solve visual puzzles. But that just told us what they do; for this study, we wanted to understand how.
"Our model of a bee's brain demonstrates that its neural circuits are optimised to process visual information not in isolation, but through active interaction with its flight movements in the natural environment, supporting the theory that intelligence comes from how the brain, body and environment work together.
"We've learnt that bees, despite having brains no larger than a sesame seed, don't just see the world - they actively shape what they see through their movements. It's a beautiful example of how action and perception are deeply intertwined to solve complex problems with minimal resources. This is something that has major implications for both biology and AI."
The model shows that bee neurons become finely tuned to specific directions and movements as their brain networks gradually adapt through repeated exposure to stimuli, refining their responses without relying on associations or reinforcement. The bee's brain thus adapts to its environment simply by observing while flying, with no need for immediate rewards, and it does so with remarkable efficiency: only a few active neurons are needed to recognise an object, conserving both energy and processing power.
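The kind of reward-free, exposure-driven tuning described above can be sketched in a few lines. The following is a generic competitive-learning toy, not the study's model; the network sizes, learning rate and stimulus directions are illustrative assumptions:

```python
# A minimal sketch (NOT the authors' model) of non-associative tuning:
# neurons exposed to repeated motion directions develop selective filters
# via Hebbian-style updates with winner-take-all competition, so only one
# neuron fires strongly for any given input (a sparse, efficient code).
import numpy as np

rng = np.random.default_rng(0)
n_neurons, dim = 4, 2                       # hypothetical sizes
W = rng.normal(size=(n_neurons, dim))
W /= np.linalg.norm(W, axis=1, keepdims=True)

def motion_sample():
    # Stimuli cluster around a few flight directions (0, 90, 180, 270 deg).
    angle = rng.choice([0, 90, 180, 270]) + rng.normal(0, 5)
    rad = np.deg2rad(angle)
    return np.array([np.cos(rad), np.sin(rad)])

for _ in range(2000):                       # repeated exposure, no rewards
    x = motion_sample()
    winner = np.argmax(W @ x)               # competition: one active neuron
    W[winner] += 0.05 * (x - W[winner])     # drift filter toward the input
    W[winner] /= np.linalg.norm(W[winner])  # keep filters unit-length

# After training, a single strongly responding neuron is enough to
# identify the motion direction of a new sample.
responses = W @ motion_sample()
print(np.round(responses, 2))
```

The point of the sketch is that selectivity emerges from exposure statistics alone: there is no teacher and no reward signal, only repetition and competition.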
To validate their computational model, the researchers subjected it to the same visual challenges encountered by real bees. In a pivotal experiment, the model was tasked with differentiating between a 'plus' sign and a 'multiplication' sign. The model exhibited significantly improved performance when it mimicked the real bees' strategy of scanning only the lower half of the patterns, a behaviour observed by the research team in a previous study.
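A toy example can illustrate why scanning only part of a pattern can suffice. This is a hypothetical matched-filter sketch, not the researchers' model: on a small grid, the lower halves of a plus and a multiplication sign are already distinct, so a readout that sees only the scanned region uses far fewer inputs without losing the discriminating signal.

```python
# Toy illustration (not the study's model): a '+' and an 'x' on a 5x5 grid
# can be told apart from their lower halves alone, so "scanning" just that
# region feeds a classifier 10 pixels instead of 25 with no loss of signal.
import numpy as np

plus = np.zeros((5, 5), dtype=int)
plus[2, :] = 1                              # horizontal bar
plus[:, 2] = 1                              # vertical bar
cross = np.eye(5, dtype=int) | np.fliplr(np.eye(5, dtype=int))  # diagonals

def lower(p):
    # Keep only the bottom two rows - the region the model "scans".
    return p[3:, :].ravel()

# Nearest-template readout over the scanned region only.
templates = {'plus': lower(plus), 'cross': lower(cross)}

def classify(patch):
    return max(templates, key=lambda name: patch @ templates[name])

print(classify(lower(plus)), classify(lower(cross)))
```

In this toy the two lower-half views do not overlap at all, so even a one-layer matched filter separates them perfectly; the real model faces noisy, moving views, but the same economy applies.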
Even with just a small network of artificial neurons, the model successfully showed how bees can recognise human faces, underscoring the strength and flexibility of their visual processing.
Professor Lars Chittka, Professor of Sensory and Behavioural Ecology at Queen Mary University of London, added: "Scientists have been fascinated by the question of whether brain size predicts intelligence in animals. But such speculations make no sense unless one knows the neural computations that underpin a given task.
"Here we determine the minimum number of neurons required for difficult visual discrimination tasks and find that the numbers are staggeringly small, even for complex tasks such as human face recognition. Thus insect microbrains are capable of advanced computations."
Professor Mikko Juusola, Professor in System Neuroscience from the University of Sheffield's School of Biosciences and Neuroscience Institute, said: "This work strengthens a growing body of evidence that animals don't passively receive information - they actively shape it.
"Our new model extends this principle to higher-order visual processing in bees, revealing how behaviourally driven scanning creates compressed, learnable neural codes. Together, these findings support a unified framework where perception, action and brain dynamics co-evolve to solve complex visual tasks with minimal resources - offering powerful insights for both biology and AI."
By bringing together evidence on how insects behave, how their brains work and what computational models predict, the study shows how small insect brains can reveal basic rules of intelligence. These findings not only deepen our understanding of cognition but also have significant implications for developing new technologies.
Read the study in full in eLife: