Insect-inspired algorithm improves performance of autonomous robots

Researchers from Delft University of Technology were inspired by biological findings on how ants use their vision and their ability to count steps to find their way home safely. Based on these insect behaviors, they developed an autonomous navigation method for small, lightweight robots. Their results were published in the journal Science Robotics.

A small 56-gram “CrazyFlie” drone capable of returning to a starting point thanks to a navigation strategy inspired by insects. Photo credit: Delft University of Technology

This strategy allows such robots to return home after long journeys while requiring only a tiny amount of computation and memory (1.16 kilobytes per 100 m). In the future, these small autonomous robots could serve many applications, from monitoring inventory in warehouses to detecting gas leaks at industrial sites.

The case for small robots

Small robots, weighing anywhere from a few dozen to a few hundred grams, have many interesting real-world applications. Their low weight makes them very safe, even if they accidentally bump into someone, and they can maneuver in confined spaces.

Additionally, because they are inexpensive to produce, they can be deployed in large numbers to quickly cover a wide area, for instance in greenhouses for early detection of pests or diseases. However, making robots this small operate autonomously is difficult, since they have far fewer resources than larger robots.

One of the main obstacles to real-world applications of small robots is that they must be able to navigate autonomously. Robots can get help from external infrastructure, such as wireless communication beacons indoors or GPS satellites outdoors, to determine their approximate position.

However, relying on such infrastructure is often undesirable. GPS is unavailable indoors and can become very inaccurate in cluttered environments such as urban canyons. Moreover, installing and maintaining beacons can be very expensive or simply impossible, for example in search-and-rescue scenarios.

The artificial intelligence needed to navigate with onboard resources alone was developed with large robots, such as self-driving cars, in mind. Some methods rely on bulky, power-hungry sensors like LiDAR laser rangefinders, which small robots cannot carry or power.

Other methods rely on vision, a very energy-efficient sensor that provides rich information about the environment. However, these methods typically attempt to build highly detailed three-dimensional maps of the environment.

Doing so requires substantial processing and memory, which can only be supplied by computers that are too large and power-hungry for small robots.

Counting steps and visual breadcrumbs

For this reason, many researchers have turned to nature for inspiration. Insects are especially interesting because they operate over distances that matter for many real-world applications while using extremely limited sensing and computing resources.

Biologists have an increasingly good understanding of the underlying strategies insects use. Specifically, insects combine keeping track of their own motion, known as “odometry,” with visually guided behaviors based on their low-resolution, almost omnidirectional visual system, known as “visual memory.”
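
Odometry alone is not enough for long routes, because the small errors made when measuring each movement add up. A minimal sketch of this effect (our illustration, assuming 5% noise per step; not a figure from the study):

```python
import random

# Dead-reckoning odometry: integrate the commanded motion, but each
# step is sensed with a small error, so the position estimate drifts.
random.seed(0)
true_x, est_x = 0.0, 0.0
for _ in range(1000):                         # 1,000 steps of 0.1 m = 100 m
    step = 0.1
    true_x += step                            # actual motion
    est_x += step * random.gauss(1.0, 0.05)   # sensed motion, 5% noise
print(f"odometry drift after 100 m: {abs(est_x - true_x):.2f} m")
```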

Whereas odometry is increasingly well understood, even at the neuronal level, the precise mechanisms underlying visual memory are still poorly known, and there are several competing theories on how insects use vision to navigate. One of the earliest proposes a “snapshot” model, in which an insect such as an ant occasionally takes a snapshot of its surroundings.

When the insect later arrives near the snapshot location, it can compare its current view with the snapshot and move so as to minimize the differences between the two. This removes the drift that inevitably accumulates when relying on odometry alone and allows the insect to navigate, or “home,” to the snapshot location.
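
As a rough illustration of the snapshot idea (our sketch, not the authors' code), a robot can repeatedly try small test moves and keep whichever one makes its current view most similar to the stored snapshot. The “view” below is a toy feature vector that, like a real panoramic image over small displacements, changes smoothly with position:

```python
import numpy as np

def view(pos):
    """Toy stand-in for the robot's omnidirectional view at `pos`."""
    x, y = pos
    return np.array([np.sin(x / 20), np.cos(x / 20),
                     np.sin(y / 20), np.cos(y / 20)])

def mismatch(pos, snapshot):
    """Squared difference between the current view and the snapshot."""
    return float(np.sum((view(pos) - snapshot) ** 2))

def home_to_snapshot(start, snapshot, max_steps=200):
    """Greedily step in whichever direction best reduces the mismatch."""
    pos = np.array(start, dtype=float)
    moves = [np.array(m) for m in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    for _ in range(max_steps):
        errors = [mismatch(pos + m, snapshot) for m in moves]
        if min(errors) >= mismatch(pos, snapshot):
            break                       # no move improves the match: arrived
        pos = pos + moves[int(np.argmin(errors))]
    return pos

snapshot = view((20, 20))               # snapshot stored at the home point
print(home_to_snapshot((30, 30), snapshot))   # -> [20. 20.]
```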

Snapshot-based navigation can be compared to how Hansel tried not to get lost in the fairy tale of Hansel and Gretel. When Hansel dropped stones on the ground, he could find his way back home. However, when he instead dropped bread crumbs that were eaten by the birds, Hansel and Gretel got lost. In our case, the stones are the snapshots.

Tom van Dijk, first author of the study, Delft University of Technology

Tom van Dijk continued: “As with the stones, for homing to a snapshot to work, the robot must be close enough to the snapshot location. If the visual surroundings differ too much from those at the snapshot location, the robot may move in the wrong direction and never get back. Hence, enough snapshots must be used, or, in Hansel’s case, enough stones must be dropped. On the other hand, dropping stones too close together would deplete Hansel’s stones too quickly.”

Van Dijk said: “In the case of a robot, using too many snapshots consumes a lot of memory. Previous work in this field typically placed snapshots very close together, so that the robot could visually home first to one snapshot and then to the next.”

The main idea behind our strategy is that snapshots can be spaced much farther apart if the robot travels between them based on odometry. Homing will then work as long as the robot ends up close enough to the snapshot location, that is, as long as the robot’s odometry drift falls within the snapshot’s catchment area. This also allows the robot to travel much farther, because the robot flies much slower when homing to a snapshot than when flying from one snapshot to the next based on odometry.

Guido de Croon, full professor and co-author of the study, Delft University of Technology
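
In outline, the strategy records widely spaced snapshots on the way out and, on the way home, retraces each leg with odometry before cancelling the accumulated drift by visually homing onto the stored snapshot. The sketch below is our idealized reading of that loop; the 10 m spacing, 5% odometry noise, and 3 m catchment radius are assumptions for illustration, and homing is simplified to snapping exactly onto the snapshot position:

```python
import random
import numpy as np

random.seed(1)

# Outbound route: snapshot positions spaced 10 m apart along 0..100 m.
snapshots = [np.array([10.0 * i, 0.0]) for i in range(11)]

pos = snapshots[-1].copy()              # the return starts at the 100 m mark
for target in reversed(snapshots[:-1]):
    leg = target - pos
    # Fast odometry leg: flying the stored vector back adds some drift.
    noise = np.array([random.gauss(0, 1), random.gauss(0, 1)])
    pos = pos + leg + 0.05 * np.linalg.norm(leg) * noise
    # Slow visual homing: if the drift left us inside the catchment area,
    # homing cancels it (idealised here as snapping onto the snapshot).
    if np.linalg.norm(pos - target) < 3.0:
        pos = target.copy()

print("final position:", pos)           # back at the start, [0. 0.]
```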

Using just 1.16 kilobytes of memory, a 56-gram “CrazyFlie” drone equipped with an omnidirectional camera was able to cover distances of up to 100 meters with the suggested insect-inspired navigation method. All visual processing runs on a tiny computer called a “microcontroller,” found in many inexpensive electronic devices.
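
To put that figure in perspective (our comparison, not one made in the paper): a single uncompressed 64 × 64-pixel grayscale image already takes 64 × 64 = 4,096 bytes, more than three times the 1.16-kilobyte budget for an entire 100-meter route, so the stored snapshots must be far more compact than raw camera images.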

Putting robotic technology to work

The proposed insect-inspired navigation strategy is an important step toward the real-world application of tiny autonomous robots. Its functionality is more limited than that offered by state-of-the-art navigation methods: it does not generate a map and only allows the robot to return to the starting point. Still, for many applications, this may be more than enough.

Guido de Croon, full professor and co-author of the study, Delft University of Technology

De Croon said: “For example, for tracking inventory in warehouses or monitoring crops in greenhouses, drones could fly out, collect data, and then return to the base station. They could store mission-relevant images on a small SD card for post-processing by a server, but they would not need those images for navigation itself.”

Visual Route Following for Tiny Autonomous Robots – Science Robotics

Video credit: Delft University of Technology

Journal reference:

van Dijk, T., et al. (2024) Visual route following for tiny autonomous robots. Science Robotics. doi.org/10.1126/scirobotics.adk0310.
