Vision-based hybrid map-building and robot localization in unstructured and moderately dynamic environments
Rady, Sherine;
Abstract
This work focuses on developing an efficient environment representation and localization method for mobile robots. A hybrid map-building and localization approach is proposed, suited to operating environments that are unstructured and moderately dynamic. The approach is vision-based and comprises two phases. In the first phase, map-building reduces the domain of point features extracted from local places through an information-theoretic analysis, which simultaneously selects only the most distinctive features. The selected features are further compressed into codewords, while the uncompressed features are tagged with their metric positions. In this way, a unified map with a hybrid data representation is created. In the second phase, the map is used to localize the robot. For fast topological localization, features extracted from the local place are compared against the codewords. To refine the localization into a metric pose, triangulation is executed hierarchically for the identified topological place using the features' positional metric data. To ensure an accurate position estimate, the dynamics of the environment are detected through the spatial layout of features and are isolated at the metric localization level. The proposed map-building and localization solution enables fast hybrid localization without degrading localization accuracy.
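The two-phase idea in the abstract can be illustrated with a minimal sketch: score each feature by the entropy of the place distribution it induces (low entropy means the feature is distinctive of few places), keep the most distinctive ones, and then localize topologically by voting over the kept features. This is a hypothetical toy illustration under assumed data structures, not the authors' implementation, and it omits the codeword compression and metric triangulation stages.

```python
# Toy sketch of information-theoretic feature selection and topological
# localization by voting. All names and data structures are illustrative
# assumptions, not taken from the paper.
import math
from collections import defaultdict


def select_distinctive_features(observations, top_k):
    """observations: list of (place, feature_id) sightings gathered
    during map-building. Each feature is scored by the entropy of
    P(place | feature); low entropy means the feature pins down a
    place and is therefore distinctive."""
    counts = defaultdict(lambda: defaultdict(int))
    for place, feat in observations:
        counts[feat][place] += 1
    scores = {}
    for feat, per_place in counts.items():
        total = sum(per_place.values())
        scores[feat] = -sum((c / total) * math.log2(c / total)
                            for c in per_place.values())
    # Lowest-entropy (most distinctive) features first.
    return sorted(scores, key=scores.get)[:top_k]


def localize(query_features, place_index, selected):
    """Vote for candidate places using only the selected distinctive
    features; place_index maps feature_id -> places where it was seen."""
    votes = defaultdict(int)
    keep = set(selected)
    for feat in query_features:
        if feat in keep:
            for place in place_index.get(feat, ()):
                votes[place] += 1
    return max(votes, key=votes.get) if votes else None
```

For example, a feature seen only in the kitchen has zero entropy and survives the selection, while a feature seen equally often in two places has maximal entropy and is pruned, so a query containing the kitchen-only feature votes for "kitchen".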
Other data
Title: Vision-based hybrid map-building and robot localization in unstructured and moderately dynamic environments
Authors: Rady, Sherine
Issue Date: 1-Mar-2016
Publisher: Springer-Verlag Berlin
Conference: Studies in Computational Intelligence
ISSN: 1860-949X
DOI: 10.1007/978-3-319-14194-7_12
Scopus ID: 2-s2.0-84956501981
Web of Science ID: WOS:000371923600013
Items in Ain Shams Scholar are protected by copyright, with all rights reserved, unless otherwise indicated.