Automatic landing of a low-cost quadrotor using monocular vision and Kalman filter in GPS-denied environments

Unmanned aerial vehicles are becoming an important part of modern life. Despite recent advances in GPS-aided navigation of quadrotors, the risk of crash and collision still overshadows their reliability and safety, especially in GPS-denied environments. The need for fully automatic methods for the safe, accurate, and independent landing of drones therefore continues to grow. This paper investigates the autolanding process by focusing on accurate and continuous position estimation of the drone using a monocular vision system, fused with data from the inertial measurement unit and ultrasonic sensors. An ArUco marker is used as the landing pad, and the information is processed at the ground station through a real-time Wi-Fi link. To overcome the closed-loop instability caused by communication and localization delays, we propose a method called the "movement slicing method". This method divides the maneuvers around the marker into moving and waiting slices, making the landing process not only more accurate but also faster. Experimental results show a successful landing of the UAV on a predefined location, accurately aligned with the marker, using the proposed method.
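As a rough illustration of the filtering step described above (not the authors' exact formulation), the sketch below shows a minimal one-axis constant-velocity Kalman filter in Python/NumPy that fuses noisy marker-based position measurements and can propagate the estimate through frames lost to communication delay. The class name, noise parameters, and 30 Hz update rate are illustrative assumptions.

```python
import numpy as np

class PositionKalmanFilter:
    """Constant-velocity Kalman filter for one axis of drone position.

    State x = [position, velocity]^T. The position measurement would come
    from the marker-based vision system (or the ultrasonic sensor for
    altitude); higher-order motion is absorbed by the process noise.
    """

    def __init__(self, dt, process_var=0.5, meas_var=0.02):
        self.x = np.zeros((2, 1))                    # state estimate
        self.P = np.eye(2)                           # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
        self.H = np.array([[1.0, 0.0]])              # only position is measured
        self.Q = process_var * np.array([[dt**4 / 4, dt**3 / 2],
                                         [dt**3 / 2, dt**2]])
        self.R = np.array([[meas_var]])              # measurement noise

    def predict(self):
        """Propagate the state between measurements (e.g. during delays)."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x

    def update(self, z):
        """Fuse a new position measurement z (metres) from the vision system."""
        y = np.array([[z]]) - self.H @ self.x        # innovation
        S = self.H @ self.P @ self.H.T + self.R      # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x


# Example: smooth noisy marker-based x-position readings arriving at 30 Hz.
kf = PositionKalmanFilter(dt=1.0 / 30.0)
for z in [0.52, 0.49, 0.51, 0.47, 0.50]:
    kf.predict()
    x_hat = kf.update(z)
    print(f"filtered position: {x_hat[0, 0]:.3f} m")
```

In practice the same predict/update structure extends to all three axes, with the measurement supplied by the marker pose estimate and the prediction step bridging gaps caused by the Wi-Fi link and localization latency.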
