
New theory allows drones to see distances with one eye

A new theory has been published that allows drones to see distances with a single camera. TU Delft’s Micro Air Vehicle Laboratory has found that drones approaching an object with an insect-inspired vision strategy become unstable at a specific distance from the object. Turning this weakness into a strength, drones can use the timely detection of that instability to estimate distance. The new theory, published online on 7 January 2016 in the journal Bioinspiration & Biomimetics, will enable further miniaturization of autonomous drones and provides a new hypothesis on flying insect behavior.

Press release

Please find the press release here.


In-depth explanation

Roboticists look with envy at small flying insects such as honeybees or fruit flies. These small animals perform marvelous feats, such as landing on a flower in gusty wind conditions. Biologists have revealed that flying insects rely heavily on optical flow for such tasks. Optical flow captures the way in which objects move across an animal’s (or robot’s) image. For instance, when looking out of a train, nearby trees will move very quickly across the image (large optical flow), while a mountain in the distance will move slowly (small optical flow).

Optical flow captures the ratio between velocity and distance and hence lacks “scale”. This means that optical flow stays the same if the velocity of the observer and the distance to the observed object are scaled by the same constant. Indeed, this is why in old science-fiction movies spectators could have the impression of travelling at high speed towards an unknown planet, while in reality the film maker was probably moving a camera very slowly toward a ball of dried paper and glue.
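The scale ambiguity can be made concrete in a few lines of Python (a minimal sketch; the numbers are illustrative):

```python
def optical_flow(velocity, distance):
    """Optical flow magnitude is the ratio of velocity to distance (unit: 1/s)."""
    return velocity / distance

# A drone moving at 1 m/s toward a wall 10 m away...
slow_and_close = optical_flow(1.0, 10.0)
# ...sees exactly the same flow as one moving at 10 m/s toward a wall 100 m away.
fast_and_far = optical_flow(10.0, 100.0)

print(slow_and_close, fast_and_far)  # both 0.1 1/s
```

From the flow alone, the two situations are indistinguishable: the camera cannot tell a slow approach to a nearby object from a fast approach to a distant one.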

Many biologists and bio-inspired roboticists believe that the absence of scale is not a problem at all. Insects seem to sidestep the estimation of height and velocity by following very straightforward control strategies based directly on optical flow. For instance, when landing they keep the flow and the expansion of the flow constant, so that they gradually slow down as they descend. Successfully introducing similar strategies in spacecraft or drones holds huge potential, since optical flow can be measured with a single, energy-efficient camera and the straightforward control strategies hardly require any computation.
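Why a constant-flow landing slows the vehicle down can be seen from the dynamics: if the divergence D = v/h is held constant, then dh/dt = −D·h, so the height decays exponentially and the descent speed shrinks in proportion. A sketch under the simplifying assumption of perfect divergence tracking (parameter values are illustrative):

```python
import math

D = 0.5      # constant optical flow divergence, 1/s (illustrative value)
h0 = 10.0    # initial height, m
dt = 0.01    # time step, s

# Keeping D = v/h constant means the descent speed is always v = D * h,
# so the drone slows down in proportion to its remaining height.
h = h0
for step in range(1000):
    v = D * h            # commanded descent speed
    h -= v * dt          # Euler integration of dh/dt = -D*h

# The closed-form solution of dh/dt = -D*h is h(t) = h0 * exp(-D*t).
t_end = 1000 * dt
print(h, h0 * math.exp(-D * t_end))  # Euler result vs. exact solution
```

The simulated height matches the exponential solution closely, and touchdown is approached asymptotically with ever-decreasing speed.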

In a Bioinspiration & Biomimetics article published online today, January 7 2016, it is shown that scale actually does matter; namely, for good control of a constant optical flow descent. A theoretical analysis is performed of a common control law for a constant optical flow landing. This control law commands more upward thrust if the robot descends too fast, and less if it descends too slowly. How much more or less depends on the difference between the desired optical flow divergence D* and the measured optical flow divergence D: Thrust = Gain x (D* – D). The analysis of this control loop shows that as soon as the robot has even the tiniest delay between sensing and acting, there is a point during the landing at which the control system becomes unstable. The reason for this is the absence of scale. Very high up, far away from the landing surface, a large gain is best for control; close to the surface, a low gain should be used. This finding seems discouraging, as it suggests that even for constant optical flow landings, we need to know the height in order to select the right control gain.
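The onset of instability can be reproduced in a toy simulation. The sketch below is not the article's exact model; it uses illustrative parameters and a simplified vertical dynamics (thrust acts directly as acceleration), with the control law Thrust = Gain × (D* – D) applied to a delayed divergence measurement. During most of the descent the loop is well behaved, but below a certain height the delayed feedback starts to oscillate:

```python
from collections import deque

K, D_star = 4.0, 0.3        # control gain and desired divergence (illustrative)
tau, dt = 0.2, 0.01         # sensing/actuation delay and time step, s
delay_steps = int(tau / dt)

h, v = 5.0, 0.0             # height (m) and descent speed (m/s, positive down)
buf = deque([0.0] * delay_steps)  # queue of delayed divergence measurements
window = deque(maxlen=80)   # recent divergence values, ~one oscillation period
h_osc = None                # height at which oscillations are first detected

t = 0.0
while h > 0.05 and t < 30.0:
    D = v / h                           # measured optical flow divergence
    buf.append(D)
    D_delayed = buf.popleft()           # measurement available to the controller
    v += K * (D_star - D_delayed) * dt  # dv/dt = Gain * (D* - D), with delay
    h -= v * dt
    t += dt
    window.append(D)
    # After the initial transient, a growing peak-to-peak swing of the
    # divergence signals the self-induced oscillation.
    if t > 4.0 and len(window) == 80 and max(window) - min(window) > 0.1:
        h_osc = h
        break

print("oscillation detected at height:", h_osc)
```

High up, the effective loop gain Gain/h is small and the delay is harmless; as h shrinks, the same fixed gain becomes relatively larger until the loop crosses its stability boundary, exactly the scale problem described above.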

However, the article also shows that, for a given control gain, the robot always becomes unstable at the same distance from the landing surface. The core idea proposed in the article, then, is to have the robot detect the oscillations that occur just before the system actually becomes unstable. By detecting these oscillations, and knowing its control gain, the robot can determine its height!

A major way in which robots can use this is to detect when they have almost landed, so that they can trigger a final landing response. Moreover, interestingly, the theoretical analysis shows that it is not even necessary to land in order to know the height: a robot can also hover and increase its control gain until small oscillations arise. The control gain at that point reveals the height. In addition, the robot can know its height during an entire landing, as long as it lands “on the edge of oscillation”.
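The gain-to-height mapping can be sketched with a simple linearized model (an assumption of this sketch, not necessarily the article's exact derivation): near hover the loop behaves like a delayed integrator, dx/dt = −(Gain/h)·x(t − τ), which sits on its stability boundary when (Gain/h)·τ = π/2. Inverting this gives the height from the gain at which oscillations first appear:

```python
import math

def height_from_oscillation_gain(K_osc, tau):
    """Estimate height (m) from the gain K_osc at which a hovering drone with
    sensing delay tau (s) starts to oscillate, assuming the delayed-integrator
    model (K/h) * tau = pi/2 at the stability boundary."""
    return 2.0 * K_osc * tau / math.pi

# Hypothetical example: oscillations first appear at gain 4.0 with a 0.2 s delay.
print(height_from_oscillation_gain(4.0, 0.2))  # about 0.51 m
```

The key point survives any particular model choice: for a known delay, the critical gain is proportional to height, so detecting the oscillation onset restores the scale that optical flow alone lacks.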

The scientific findings in the article can lead to even smaller autonomous drones, as it may take away the need for additional sensors such as sonar. Moreover, the theory provides a novel hypothesis on how flying insects can have specific distance-based behaviors, such as triggering a landing response at a given distance from the landing surface.

Article

The article has been published as open access in Bioinspiration & Biomimetics.

Code

Paparazzi code
MATLAB code – still cleaning it up, expected soon…

Related work

There exist two other types of strategies for estimating distances with optical flow from a single camera. First, one can model the effect that a drone’s actions have on its accelerations [1,2]; these predicted accelerations then provide the “scale” to optical flow. Second, it was recently found that during a constant-divergence approach to an object, the thrust is proportional to the height above the landing surface [3]; hence, the thrust can be used as a stand-in for distance. Both of these approaches assume that the drone is accelerated only by its own thrust. However, wind and wind gusts also have a big effect on the motion of small drones. The novel, stability-based theory published today relies on self-induced oscillations and is hence much more robust to external disturbances.

[1] Corke P.I. and Good M.C. 1992 Dynamic effects in high-performance visual servoing Proc. IEEE Int. Conf. on Robotics and Automation 1838–43
[2] Asl H.J., Oriolo G. and Bolandi H. 2014 An adaptive scheme for image-based visual servoing of an underactuated UAV Int. J. Rob. Autom. 29 92–104
[3] van Breugel F., Morgansen K. and Dickinson M.H. 2014 Monocular distance estimation from optic flow during active landing maneuvers Bioinsp. Biomim. 9 025002