Have you ever wondered how insects manage to travel such long distances beyond their habitat and find their way back? The answer to this question is not only relevant for biology, but also for the design of AI for tiny autonomous robots. Drone researchers at Delft University of Technology were inspired by biological findings about how ants visually recognize their environment and combine this with step counting to find their way home safely. They used this knowledge to create an insect-inspired autonomous navigation strategy for tiny lightweight robots. This strategy allows these robots to return home after long trajectories, while requiring extremely little computation and memory (0.65 kilobytes per 100 m). In the future, tiny autonomous robots could find a wide range of uses, from monitoring inventory in warehouses to searching for gas leaks in industrial sites. The researchers published their results in Science Robotics on July 17, 2024.
Defend the little ones
Miniature robots, weighing from a few tens to a few hundred grams, have the potential to find interesting real-world applications. Thanks to their low weight, they are extremely safe even if they accidentally bump into someone, and their small size lets them move through narrow spaces. If they can be manufactured cheaply enough, they can be deployed in larger numbers to quickly cover a large area, for example in greenhouses to detect pests or diseases early.
However, it is difficult to operate such small robots autonomously, as they have extremely limited resources compared to larger robots. One of the main obstacles is navigation: the robots must be able to find their way on their own. For this, robots can rely on external infrastructure, using location estimates from GPS satellites outdoors or wireless communication beacons indoors. However, relying on such infrastructure is often undesirable. GPS is unavailable indoors and can become very inaccurate in cluttered environments such as urban canyons, and installing and maintaining beacons in indoor spaces is expensive or simply impossible, for example in search-and-rescue scenarios.
The artificial intelligence needed for autonomous navigation with onboard resources alone has so far been designed with large robots in mind, such as self-driving cars. Some approaches rely on heavy, power-hungry sensors, such as LiDAR laser rangefinders, which small robots simply cannot carry or power. Other approaches use cameras, which are very power-efficient sensors that provide rich information about the environment. However, these approaches typically attempt to build highly detailed 3D maps of the environment. This requires large amounts of processing and memory, which can only be provided by computers that are too large and power-hungry for tiny robots.
Counting steps and visual breadcrumbs
That’s why some researchers have looked to nature for inspiration. Insects are particularly interesting because they move over distances that could be relevant for many real-world applications, while using very limited sensing and computational resources. Biologists are increasingly understanding the underlying strategies that insects use. Specifically, insects combine tracking their own movements (called “odometry”) with visually guided behaviors based on their low-resolution but nearly omnidirectional visual system (called “visual memory”). While odometry is increasingly well understood, even down to the neuronal level, the precise mechanisms underlying visual memory are much less well understood. One early theory of how it works proposes a “snapshot” model. In this model, an insect such as an ant is thought to take occasional snapshots of its environment. Later, when the insect arrives near the snapshot location, it can compare its current visual perception to the snapshot and move to minimize the differences. This allows the insect to navigate, or “home,” to the snapshot location, removing any drift that inevitably accumulates when performing only odometry.
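The snapshot model can be sketched in a few lines of code. The sketch below is a toy illustration, not the authors’ implementation: a vector of distances to a few fixed landmarks stands in for the low-resolution panoramic view, and homing greedily steps in whichever direction best reduces the mismatch with the stored snapshot. All names and numbers here are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for a panoramic view: the vector of distances from `pos`
# to fixed landmarks (a real insect or robot would use low-resolution
# omnidirectional imagery instead). Positions are illustrative.
LANDMARKS = np.array([[0.0, 5.0], [4.0, -3.0], [-6.0, 1.0], [2.0, 7.0]])

def view(pos):
    return np.linalg.norm(LANDMARKS - pos, axis=1)

def home_to_snapshot(pos, snapshot, step=0.25, max_steps=200):
    """Greedy snapshot homing: repeatedly take the candidate step that
    most reduces the difference between the current view and the snapshot."""
    pos = np.array(pos, dtype=float)
    directions = [np.array([np.cos(a), np.sin(a)])
                  for a in np.linspace(0.0, 2 * np.pi, 8, endpoint=False)]
    for _ in range(max_steps):
        err = np.sum((view(pos) - snapshot) ** 2)
        candidates = [(np.sum((view(pos + step * d) - snapshot) ** 2), d)
                      for d in directions]
        best_err, best_dir = min(candidates, key=lambda c: c[0])
        if best_err >= err:        # no candidate improves the match: stop
            break
        pos = pos + step * best_dir
    return pos

snapshot_pos = np.array([1.0, 2.0])
snapshot = view(snapshot_pos)          # "drop a stone": store the view here
drifted = snapshot_pos + np.array([1.5, -1.2])   # odometric drift on return
homed = home_to_snapshot(drifted, snapshot)
print(np.linalg.norm(homed - snapshot_pos))      # small residual error
```

Note that homing only works because the drifted start point is still close enough that the landmark views overlap; from too far away, the greedy descent could move in the wrong direction, which is exactly the catchment-area limit the article describes next.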
“Snapshot navigation can be compared to how Hansel tried not to get lost in the fairy tale of Hansel and Gretel. When Hansel threw stones on the ground, he could find his way home. However, when he threw bread crumbs that were eaten by birds, Hansel and Gretel got lost. In our case, the stones are the snapshots,” explains Tom van Dijk, first author of the study. “As with a stone, for a snapshot to work, the robot must be close enough to the snapshot location. If the visual environment is too different from the snapshot location, the robot may move in the wrong direction and never return. So you have to use enough snapshots – or, in Hansel’s case, drop enough stones. On the other hand, dropping stones too close together would exhaust Hansel’s stones too quickly.” In the case of a robot, using too many snapshots leads to significant memory consumption. Previous work in this area typically placed snapshots very close to each other, so the robot could visually navigate to one snapshot first and then to the next one.
“The main idea behind our strategy is that you can space snapshots much further apart if the robot moves between them based on odometry,” says Guido de Croon, professor of bio-inspired drones and co-author of the paper. “Homing will work as long as the robot ends up close enough to the snapshot location, that is, as long as the robot’s odometric drift stays within the snapshot’s catchment area. Spacing snapshots further apart also lets the robot travel much further, because the robot flies much more slowly when visually homing to a snapshot than when flying from one snapshot to the next on odometry.”
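The trade-off de Croon describes can be made concrete with a minimal model. The numbers below are assumptions for illustration, not values from the paper: odometric drift grows with the distance flown between snapshots, and visual homing only succeeds if the robot arrives within the snapshot’s catchment radius, so the snapshot spacing must keep per-leg drift below that radius.

```python
DRIFT_RATE = 0.03   # metres of odometric drift per metre flown (assumed)
CATCHMENT = 2.0     # snapshot catchment radius in metres (assumed)
RESIDUAL = 0.05     # position error left after visual homing (assumed)

def return_home(route_length_m, snapshot_spacing_m):
    """Fly home snapshot-to-snapshot on odometry, homing visually at each.
    Returns the final position error, or None if a snapshot is missed."""
    error, remaining = 0.0, route_length_m
    while remaining > 0:
        leg = min(snapshot_spacing_m, remaining)
        error += DRIFT_RATE * leg      # drift accumulates over the leg
        if error > CATCHMENT:
            return None                # outside the catchment area: lost
        error = RESIDUAL               # visual homing resets the drift
        remaining -= leg
    return error

print(return_home(100.0, 20.0))   # spacing OK: drift per leg 0.6 m < 2 m
print(return_home(100.0, 90.0))   # too sparse: first leg drifts 2.7 m
```

Under these assumed numbers, the final error is bounded by the per-leg drift regardless of route length, which is why sparse snapshots plus odometry can cover long trajectories with very little stored data.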
The proposed insect-inspired navigation strategy allowed a 56-gram “Crazyflie” drone, equipped with an omnidirectional camera, to cover distances of up to 100 meters using just 0.65 kilobytes of memory. All the visual processing was done on a tiny computer called a “microcontroller,” which is found in many inexpensive electronic devices.
Putting robotic technology to work
“The proposed insect-inspired navigation strategy is an important step towards the real-world application of small autonomous robots,” says Guido de Croon. “The functionality of the proposed strategy is more limited than that provided by state-of-the-art navigation methods. It does not generate a map and only allows the robot to return to the starting point. Yet for many applications, this may be more than sufficient. For example, for inventory tracking in warehouses or crop monitoring in greenhouses, drones could take off, collect data, and then return to the base station. They could store mission-relevant images on a small SD card for post-processing by a server, but would not need those images for the navigation itself.”