The Secret to Smarter Robots: Ants

By Matthew Braga

What robots can learn from ant swarm behavior, and how roboticists are already applying lessons from ant studies to autonomous drones.

Your cat is stuck in a burning building too dangerous for rescue crews to go inside, so off go the drones instead – five little unmanned aerial models that hover and flit through fiery beams and door frames without any human control. They know to spread out to cover more ground, and know how to adjust their search patterns when the communication links with the other drones go down. Their algorithms find and retrieve your cat in what rescue crews tell you is record time.

Or that's the dream anyhow, to one day build artificially intelligent, self-organizing robot systems that can collaborate on complex tasks – or, at the very least, rescue imperiled cats. We're not there yet, but researchers have been getting closer, thanks in part to what we're learning from the collective behavior of ants.


Look back through artificial intelligence literature from the past few decades and you'll find ant-inspired algorithms are a popular topic of study. Of note, Swiss artificial intelligence researcher Marco Dorigo was the first to algorithmically model ant colony behavior in the early 1990s, and Stanford University biologist Deborah Gordon published her own study on the expandable search networks of ants a few years later. Today, both have different but related ideas on how we might implement so-called ant-inspired swarm intelligence in robots – and perhaps soon, drones – outside of the lab.

Consider, for example, how ants explore and search. Ants change how they search for resources such as food and water depending on the number of ants nearby. According to Gordon, if there is a high density of ants in an area, the ants search more thoroughly in small, random circles. If there are fewer ants, the ants adjust their paths to be straighter and longer, allowing them to cover more ground.
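That density-dependent trade-off can be sketched as a random walk whose turning noise grows with the number of nearby ants. This is a minimal toy model of the behavior Gordon describes, not her actual model; the mapping from density to turning spread is an assumption made for illustration.

```python
import math
import random

def search_step(x, y, heading, local_density, step=1.0):
    """One step of a density-dependent random walk.

    High local density -> large random turns (tight, thorough circling).
    Low local density  -> small turns (long, straight exploratory paths).
    The 0.1 + 0.3 * density mapping is an assumed, illustrative choice.
    """
    turn_spread = min(math.pi, 0.1 + 0.3 * local_density)
    heading += random.uniform(-turn_spread, turn_spread)
    return x + step * math.cos(heading), y + step * math.sin(heading), heading

def net_displacement(local_density, steps=500):
    """Distance from the starting point after a fixed number of steps."""
    x, y, heading = 0.0, 0.0, 0.0
    for _ in range(steps):
        x, y, heading = search_step(x, y, heading, local_density)
    return math.hypot(x, y)

random.seed(42)
# A crowded ant circles tightly and stays local; a lone ant ranges far.
print(net_displacement(local_density=10))  # thorough local search
print(net_displacement(local_density=0))   # straighter paths, more ground covered
```

Running this, the low-density walker typically ends up an order of magnitude farther from its starting point than the high-density one, mirroring the circling-versus-ranging behavior described above.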


This is all well and good in typical ant environments – but how do the ants adapt when interference is introduced and their communication with other ants is interrupted? To find out, Gordon sent over 600 small, black pavement crawlers to the International Space Station in January, and believes that studying how they react to the unfamiliar microgravity of space could help build better robots. Her research is especially prescient in the age of the drone.

In a Stanford news release, Gordon described the interference introduced by microgravity as "analogous to the radio disruption that robots might experience in a blazing building." Depending on how Gordon's space ants adapt, she thinks the results, when applied to robotics and artificial intelligence, could help us program more efficient algorithms for search and exploration – especially when our robots are faced with unfamiliar environments, and with little to no human control.

"We don’t have the results yet," wrote Gordon in an email, "but it is clear that the relation between encounter rate and density was different in microgravity, because sometimes an ant just lost its hold on the surface, went spinning around in the air and came down somewhere else."

Logistics companies have used Ant Colony Optimization to make their deliveries more efficient, training human drivers to behave more like ants.

At the Université libre de Bruxelles, Dr. Dorigo is doing different work, but with a similar goal. In his initial research on Ant Colony Optimization, Dr. Dorigo focused not on how ants could expand their search networks, but on the behavioral algorithms that ants employ to follow the most efficient path to a goal. In short, once an ant finds something good, it leaves a pheromone trail behind for other ants to follow. But pheromones evaporate over time, so the shorter the path, the stronger it smells, and the more likely other ants are to follow it – reinforcing the trail, and its correctness, with pheromones of their own.
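The evaporate-and-reinforce loop above can be shown with a toy "two bridges" simulation: ants choose between a short and a long path in proportion to each path's pheromone level, then deposit pheromone inversely proportional to the path's length. This is an illustrative sketch of the mechanism, not Dorigo's exact formulation; all parameter values here are assumptions.

```python
import random

def simulate_two_bridges(len_short=1.0, len_long=2.0,
                         n_ants=100, rounds=50,
                         evaporation=0.5, seed=1):
    """Toy two-path pheromone loop in the spirit of Ant Colony Optimization.

    Each round, every ant picks a path with probability proportional to its
    pheromone level, then deposits 1/length units of pheromone on it.
    Evaporation decays both trails, so the shorter path's stronger,
    faster-reinforced trail comes to dominate.
    """
    rng = random.Random(seed)
    pheromone = {"short": 1.0, "long": 1.0}   # start with no preference
    lengths = {"short": len_short, "long": len_long}
    for _ in range(rounds):
        deposits = {"short": 0.0, "long": 0.0}
        total = pheromone["short"] + pheromone["long"]
        for _ in range(n_ants):
            choice = "short" if rng.random() < pheromone["short"] / total else "long"
            deposits[choice] += 1.0 / lengths[choice]  # shorter path smells stronger
        for path in pheromone:
            pheromone[path] = (1 - evaporation) * pheromone[path] + deposits[path]
    return pheromone

trails = simulate_two_bridges()
print(trails)  # the "short" trail ends up far stronger than the "long" one
```

Even though both paths start with equal pheromone, the positive feedback between trail strength and ant traffic lets the colony converge on the shorter route with no central coordinator – the core idea behind ACO's use in routing problems.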

Early implementations of Ant Colony Optimization helped logistics companies make their deliveries more efficient – essentially training human drivers to behave more like ants, which is actually pretty cool. But it didn't take long for Dr. Dorigo to apply his algorithms to robots, and the concept of swarm intelligence was born. From 2001 to 2005, Dr. Dorigo coordinated the Swarm-bots project, an early exploration into the feasibility of swarm robotics that required at least twenty small, identical robots to be "capable of self-assembling and self-organising to adapt to its environment." After its success, Dr. Dorigo followed with the Swarmanoid project from 2006 to 2010, which expanded on the Swarm-bots project by increasing both the number and types of robots involved.

An article in The Guardian earlier this year suggested that swarm robots could one day be used to restore coral ecosystems, clean up oil spills, and water and harvest crops. But naturally, one of the next steps is the implementation of swarm intelligence in drones, which can traverse varied environments and large expanses better than many land-based robots can. Already, researchers in Hungary have created a "flock" of drones that exhibit bird-like behaviors, while researchers at the University of Pennsylvania's GRASP Laboratory are beginning to apply ant swarm algorithms to aerial drones that "exhibit goal-driven behavior, while sensing and reacting to changing environment."

Of course, challenges certainly remain. We don't yet have a reliable way for drones to communicate with one another, ad hoc, without relying on existing communications infrastructure, and there are obvious regulatory hurdles – most drone laws in places such as Canada and the U.S. require drones to remain within sight and control of a human operator, assuming their operation is approved at all. Ethical questions also remain about what decisions we should allow drones to make on their own.

But the potential is there, crawling on the ground. And in the race to make smarter robots, there's still lots we can learn from ants.