The way insects see and track their prey is being applied to a new robot under development at the University of Adelaide, in the hopes of improving robot visual systems. The project - which crosses the boundaries of neuroscience, mechanical engineering and computer science - builds on years of research into insect vision at the University.
In a new paper published today in the Journal of The Royal Society Interface, researchers describe how insights from both insects and humans can be applied in a model virtual reality simulation, enabling an artificial intelligence system to 'pursue' an object.
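The paper itself is not reproduced here, but the idea of an agent 'pursuing' a target in a simulated environment can be sketched in a few lines. The following is a minimal, illustrative pursuit loop, not the researchers' model: the agent repeatedly turns its heading toward the target (with a capped turn rate) and moves forward at constant speed. All parameter values and function names are assumptions chosen for the example.

```python
import math

def pursue(agent, heading, target, speed=1.0, turn_rate=0.3, steps=200):
    """Illustrative pursuit sketch (not the published model).

    At each step the agent computes the bearing to the target, turns
    toward it by at most `turn_rate` radians, then advances by `speed`.
    Returns (caught, final_position).
    """
    ax, ay = agent
    tx, ty = target
    for _ in range(steps):
        bearing = math.atan2(ty - ay, tx - ax)
        # Smallest signed angular difference between bearing and heading.
        diff = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
        heading += max(-turn_rate, min(turn_rate, diff))
        ax += speed * math.cos(heading)
        ay += speed * math.sin(heading)
        if math.hypot(tx - ax, ty - ay) < speed:
            return True, (ax, ay)
    return False, (ax, ay)

caught, position = pursue(agent=(0.0, 0.0), heading=0.0, target=(50.0, 30.0))
```

A real system, as the article goes on to describe, must additionally detect the target against a cluttered background and ignore distractors; this sketch assumes the target's position is already known.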
"Detecting and tracking small objects against complex backgrounds is a highly challenging task," says the lead author of the paper, Mechanical Engineering PhD student Zahra Bagheri.
"Consider a cricket or baseball player trying to take a match-winning catch in the outfield. They have seconds or less to spot the ball, track it and predict its path as it comes down against the brightly coloured backdrop of excited fans in the crowd - all while running or even diving towards the point where they predict it will fall!
"Robotics engineers still dream of providing robots with the combination of sharp eyes, quick reflexes and flexible muscles that allow a budding champion to master this skill," she says.
Research conducted in the lab of University of Adelaide neuroscientist Dr Steven Wiederman (School of Medical Sciences) has shown that flying insects, such as dragonflies, show remarkable visually guided behaviour. This includes chasing mates or prey, even in the presence of distractions, like swarms of insects.
"They perform this task despite their low visual acuity and a tiny brain, around the size of a grain of rice. The dragonfly chases prey at speeds up to 60 km/h, capturing them with a success rate over 97%," Ms Bagheri says.