New low-cost technology to prevent drone collisions
Using only on-board sensors and cameras, researcher Julián Estévez, from the Computational Intelligence Group (GIC) of the University of the Basque Country (UPV/EHU), has developed low-cost, autonomous navigation technology to prevent two or more drones whose paths cross in mid-air from colliding with each other. He has achieved positive, encouraging results.
A study using a set of drones has confirmed the approach. "Despite the reduced cost of the technology, the solution we have developed has been successfully validated on commercial drones. Using simple, low-cost equipment and an algorithm based on artificial vision and color identification, we have developed a robust piece of technology that satisfactorily prevents collisions between drones and can be easily extrapolated to most commercial and research aerial robots; we have also made the complete software code for the solution available," said Estévez.
The work is published in the journal Aerospace Science and Technology.
Most of the drones we are familiar with are piloted by a human operator, even when they fly outside the operator's view. For a drone to be fully autonomous, it has to be able to make flight decisions on its own, without human intervention: in other words, to decide for itself how to avoid collisions, maintain its course in the face of wind gusts, control flight speed, dodge buildings and trees, and so on.
"This work is a small step towards fully autonomous navigation, without any human intervention, so that drones can decide which maneuver to perform, which direction to take, thus preventing collisions with each other or with other airborne obstacles. If we assume that, in the future, our airspace will be much more populated by commercial services performed by these drones, our work is a small contribution in this respect," said Estévez.
The author explained that "our approach to preventing collisions does not require drones to exchange information with each other; instead, each drone relies solely on its on-board sensors and cameras. We get the signal from the camera on board the drones, and by processing the images, we adjust the reactions of the robots so that they fly smoothly and accurately."
In the experiments, the researchers tried to mimic realistic flight conditions, that is, scenarios that can occur in a typical urban area: uncontrolled lighting, drones flying in different directions, and so on. Their contributions are therefore geared towards real-world applications, despite the initial laboratory work.
Color-based algorithms
"We equipped each drone with a red card that allows the software algorithm to detect the presence of an approaching drone and measure its proximity," explained Estévez. "Our proposal is very simple: each drone is equipped with an on-board camera, the screen of which is divided into two halves (left and right). This camera always seeks out the red color of the cards mentioned above.
"Through simple image processing, we can find out what percentage of the camera is occupied by the color red, and whether most of this red is on the left- or right-hand side of the screen. If most of the red zone is on the left-hand side of the screen, the drone will fly to the right to avoid collision. If the red zone is on the right, it will move to the left. And this happens with all airborne drones.
"When the percentage of the color red on the screen increases, it means that the drones are approaching each other head-on. So when a threshold is exceeded, the robot knows that it has to perform the avoidance maneuver.
"All this happens autonomously, without the human operator intervening. It's a simple way to prevent collisions, and can be performed by low-cost sensors and equipment," said Estévez. It is not unlike what happens when a person is walking down the street and sees someone approaching from the left, in which case the person tries to move to the right so that they do not collide with each other.