Researchers at the Massachusetts Institute of Technology (MIT) presented a project at the International Symposium on Experimental Robotics involving an autonomous drone fleet system that collaboratively mapped an environment under dense forest canopy.
Designed with search-and-rescue missions in mind, the drones rely on lidar, onboard computation, and wireless communication, and require no GPS positioning.
Each drone carries laser range finders for position estimation, localization, and path planning. As it flies, each drone builds its own 3-D map of the terrain. A ground station uses simultaneous localization and mapping (SLAM) technology to combine the individual maps from multiple drones into a global 3-D map that operators can monitor.
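The article does not describe the ground station's implementation, but the fusion step it outlines, transforming each drone's local map into a shared global frame, can be sketched in a few lines. The sketch below is a simplified 2-D illustration with hypothetical function names (`pose_matrix`, `fuse_maps`), assuming each drone's pose in the global frame has already been estimated:

```python
import numpy as np

def pose_matrix(yaw, translation):
    """Homogeneous 2-D pose (rotation about the vertical axis plus
    translation) estimated for one drone."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(3)
    T[:2, :2] = [[c, -s], [s, c]]
    T[:2, 2] = translation
    return T

def fuse_maps(local_maps, poses):
    """Transform each drone's local point map into the global frame
    and stack the results into one global map."""
    global_points = []
    for points, T in zip(local_maps, poses):
        # Append a homogeneous coordinate so the pose matrix applies
        # rotation and translation in one multiplication.
        homog = np.hstack([points, np.ones((len(points), 1))])
        global_points.append((T @ homog.T).T[:, :2])
    return np.vstack(global_points)

# Two drones, each observing one point in its own local frame.
map_a = np.array([[1.0, 0.0]])
map_b = np.array([[1.0, 0.0]])
poses = [pose_matrix(0.0, [0.0, 0.0]),
         pose_matrix(np.pi / 2, [5.0, 0.0])]
print(fuse_maps([map_a, map_b], poses))  # second point lands at (5, 1)
```

A real system would fuse full 3-D lidar scans and refine the poses continuously; this only shows the coordinate-frame bookkeeping that makes a shared global map possible.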
The MIT team tested its concept in simulations of randomly generated forests, and field-tested two drones in a forested area at NASA's Langley Research Center. In both experiments, each drone mapped a roughly 20-square-meter area in about two to five minutes, while the control system merged their maps in real time.
The drones were programmed to identify the orientations of multiple trees at once, since recognizing an individual tree is impossible for the technology, and determining a single tree's orientation is very difficult. When the lidar signal returns a cluster of trees, an algorithm calculates the angles and distances between the trees to identify the cluster and determine whether it has already been identified and mapped, or is a new mini-environment.
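The cluster-matching idea described above can be sketched with a simple geometric signature. The code below is an illustrative approximation, not the MIT team's algorithm: it summarizes a cluster by its sorted pairwise tree-to-tree distances (which, like the angles-and-distances description in the article, do not change when the cluster is viewed from a different position) and compares that signature against clusters already mapped. The function names and tolerance are assumptions:

```python
import numpy as np
from itertools import combinations

def cluster_signature(tree_positions):
    """Sorted pairwise distances between trees in a cluster: invariant
    to the drone's position, heading, and detection order."""
    dists = [np.linalg.norm(np.subtract(a, b))
             for a, b in combinations(tree_positions, 2)]
    return np.sort(dists)

def match_cluster(signature, known_signatures, tol=0.2):
    """Return the index of a previously mapped cluster whose signature
    matches within tol meters, or None if this is a new cluster."""
    for i, known in enumerate(known_signatures):
        if len(known) == len(signature) and np.allclose(known, signature, atol=tol):
            return i
    return None
```

For example, a three-tree cluster scanned again from a rotated, shifted viewpoint produces the same signature and is matched to its earlier entry rather than being mapped twice.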
The technique also aids in merging maps from separate drones. When two drones scan the same cluster of trees, the ground station calculates the relative transformation between the drones, then fuses their individual maps while maintaining consistent orientations.
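Computing a relative transformation from a shared landmark cluster is a standard rigid-alignment problem. As one plausible sketch (the article does not say which method the team used), the Kabsch/Procrustes method below recovers the rotation and translation that map one drone's tree positions onto the other's, given the correspondence established by the cluster match:

```python
import numpy as np

def relative_transform(points_a, points_b):
    """Least-squares rigid transform (R, t) mapping drone B's tree
    positions onto drone A's, via the Kabsch/Procrustes method."""
    a = np.asarray(points_a, dtype=float)
    b = np.asarray(points_b, dtype=float)
    # Center both point sets on their centroids.
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    # Cross-covariance of the centered sets, then SVD.
    H = (b - cb).T @ (a - ca)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (H.shape[0] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = ca - R @ cb
    return R, t
```

Once `R` and `t` are known, every point in drone B's map can be re-expressed in drone A's frame (`R @ p + t`), so the two maps fuse with consistent orientations.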