Shaowu Yang and Sebastian A. Scherer and Konstantin Schauwecker and Andreas Zell

Onboard Monocular Vision for Landing of an MAV on a Landing Site Specified by a Single Reference Image

2013 International Conference on Unmanned Aircraft Systems (ICUAS'13), Atlanta, GA, USA, May 2013, pp. 317-324


Abstract

This paper presents a real-time monocular vision solution for MAVs to autonomously search for and land on an arbitrary landing site. The autonomous MAV is provided with only a single reference image of the landing site, of unknown size, before initiating this task. To search for such landing sites, we extend a well-known visual SLAM algorithm that enables autonomous navigation of the MAV in unknown environments. A multi-scale ORB-feature-based method is implemented and integrated into the SLAM framework for landing site detection. We use a RANSAC-based method to locate the landing site within the map of the SLAM system, taking advantage of those map points associated with the detected landing site. We demonstrate the efficiency of the presented vision system in autonomous flight, and compare its accuracy with ground truth data provided by an external tracking system.
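
As an illustration of the detection step described in the abstract, the following is a minimal sketch of locating a landing site from a single reference image using multi-scale ORB features and a RANSAC-estimated homography. It is written in Python with OpenCV; the function names, parameters, and thresholds are illustrative assumptions and do not reproduce the paper's onboard implementation, which integrates detection into the visual SLAM framework and associates inlier features with SLAM map points rather than operating on isolated frames.

import cv2
import numpy as np

# Multi-scale ORB detector: scaleFactor and nlevels define the image pyramid
# (values here are OpenCV defaults, chosen only for illustration).
orb = cv2.ORB_create(nfeatures=1000, scaleFactor=1.2, nlevels=8)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)

def detect_landing_site(reference_img, camera_frame, min_inliers=15):
    """Return the landing-site corners in the camera frame, or None.
    Both inputs are assumed to be 8-bit grayscale images."""
    kp_ref, des_ref = orb.detectAndCompute(reference_img, None)
    kp_cam, des_cam = orb.detectAndCompute(camera_frame, None)
    if des_ref is None or des_cam is None:
        return None

    # Hamming-distance matching with a ratio test to reject ambiguous matches.
    good = []
    for pair in matcher.knnMatch(des_ref, des_cam, k=2):
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
    if len(good) < min_inliers:
        return None

    src = np.float32([kp_ref[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_cam[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # RANSAC homography: inliers localize the (assumed planar) landing site in
    # the image. In the paper, such correspondences are further associated with
    # SLAM map points so the site can be placed in the metric map.
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None or int(inlier_mask.sum()) < min_inliers:
        return None

    h, w = reference_img.shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(corners, H)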


Downloads and Links

[pdf]


BibTeX

@inproceedings{YangsICUAS13,
  author = {Shaowu Yang and Sebastian A. Scherer and Konstantin Schauwecker and
	Andreas Zell},
  title = {{Onboard Monocular Vision for Landing of an MAV on a Landing Site
	Specified by a Single Reference Image}},
  booktitle = {2013 International Conference on Unmanned Aircraft Systems (ICUAS'13)},
  year = {2013},
  pages = {317--324},
  address = {Atlanta, GA, USA},
  month = {May},
  abstract = {This paper presents a real-time monocular vision solution for MAVs
	to autonomously search for and land on an arbitrary landing site.
	The autonomous MAV is provided with only a single reference image
	of the landing site, of unknown size, before initiating this task.
	To search for such landing sites, we extend a well-known visual SLAM
	algorithm that enables autonomous navigation of the MAV in unknown
	environments. A multi-scale ORB-feature-based method is implemented
	and integrated into the SLAM framework for landing site detection.
	We use a RANSAC-based method to locate the landing site within the
	map of the SLAM system, taking advantage of those map points associated
	with the detected landing site. We demonstrate the efficiency of
	the presented vision system in autonomous flight, and compare its
	accuracy with ground truth data provided by an external tracking
	system.},
  days = {28-31},
  pdf = {http://www.cogsys.cs.uni-tuebingen.de/publikationen/2013/ICUAS13_yangs_final.pdf},
  url = {http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6564704}
}