Collaborative Perception, Localization and Mapping for Autonomous Systems (ISBN 9789811588594, 9789811588600)

This book presents breakthrough and cutting-edge progress in collaborative perception and mapping by proposing a no…


English, 141 pages [149], 2021

Table of contents:
Preface
Contents
1 Introduction
1.1 Background
1.1.1 Motivations
1.1.2 Challenges
1.2 Objective of This Book
1.3 Preview of Chapters
References
2 Technical Background
2.1 Collaborative Perception and SLAM
2.1.1 Single Robot Perception and SLAM
2.1.2 Multi-Robot SLAM
2.1.3 Multi-Robot Map Fusion
2.2 Data Registration and Matching
2.2.1 Registration of Sensor Data
2.2.2 Homogeneous Map Matching
2.2.3 Heterogeneous Map Matching
2.3 Collaborative Information Fusion
2.3.1 Map Inconsistency Detection
2.3.2 Probabilistic Information Integration
References
3 Point Registration Approach for Map Fusion
3.1 Introduction
3.2 OICP Algorithm
3.2.1 Uncertainty in Occupancy Probability
3.2.2 Uncertainty in Positional Value
3.3 Transformation Evaluation and Probability Fusion
3.3.1 Transformation Evaluation
3.3.2 Relative Entropy Filter
3.4 Experimental Results
3.4.1 Registration Results
3.4.2 Transformation Evaluation
3.4.3 Probabilistic Map Fusion
3.5 Conclusions
References
4 Hierarchical Map Fusion Framework with Homogeneous Sensors
4.1 Introduction
4.2 System Overview
4.3 Map Uncertainty Modeling
4.3.1 Individual Voxel Uncertainty
4.3.2 Structural Edge Uncertainty
4.3.3 Local Uncertainty Propagation
4.4 Two-Level Probabilistic Map Matching
4.4.1 The Formulation of Two-Level Probabilistic Map Matching Problem
4.4.2 Probabilistic Data Association
4.4.3 Error Metric Optimization
4.5 Transformation Evaluation and Probability Merging
4.5.1 Transformation Evaluation
4.5.2 Relative Entropy Filter
4.6 Experimental Results
4.6.1 Evaluation Protocol
4.6.2 Edge Matching Analysis
4.6.3 Full Map Matching Analysis
4.6.4 Statistical Testing and Map Merging
4.7 Conclusions
References
5 Collaborative 3D Mapping Using Heterogeneous Sensors
5.1 Introduction
5.2 Distributed Multi-Robot Map Fusion
5.2.1 System Architecture
5.2.2 System Framework Definition and Formulation
5.2.3 Map Fusion Definition and Formulation
5.3 Multi-Robot Map Matching
5.3.1 Mathematical Formulation
5.3.2 3D Occupancy Map Matching
5.3.3 E-Step
5.3.4 M-Step
5.4 Time-Sequential Map Merging
5.4.1 Uncertainty Propagation and Transformation
5.4.2 Uncertainty Merge
5.5 Experimental Results
5.5.1 Evaluation Protocol
5.5.2 Indoor Environment
5.5.3 Mixed Environment Ground Floor
5.5.4 Changi Exhibition Center
5.5.5 Unstructured Environment
5.5.6 Analysis of Experiment Results
5.6 Conclusions
References
6 All-Weather Collaborative Mapping with Dynamic Objects
6.1 Introduction
6.2 Framework of Collaborative Dynamic Mapping
6.3 Multimodal Environmental Perception
6.3.1 Heterogeneous Sensor Calibration
6.3.2 Separation of Static and Dynamic Observations
6.4 Distributed Collaborative Dynamic Mapping
6.4.1 Single Robot Level Definition
6.4.2 Collaborative Robots Level Definition
6.5 Experiments
6.5.1 Experiments Overview
6.5.2 Daytime Unstructured Environment
6.5.3 Night-Time Unstructured Environment
6.5.4 Quantitative Analysis
6.6 Conclusions
References
7 Collaborative Probabilistic Semantic Mapping Using CNN
7.1 Introduction
7.2 System Framework
7.2.1 The Framework of Hierarchical Semantic 3D Mapping
7.2.2 Centralized Problem Formulation
7.2.3 Distributed Hierarchical Definition
7.3 Collaborative Semantic 3D Mapping
7.3.1 Multimodal Semantic Information Fusion
7.3.2 Single Robot Semantic Mapping
7.3.3 Collaborative Semantic Map Fusion
7.4 Experimental Results
7.4.1 Evaluation Overview
7.4.2 Open Carpark
7.4.3 Mixed Indoor-Outdoor
7.4.4 UAV-UGV Mapping
7.4.5 Quantitative Analysis
7.5 Conclusion
References
8 Conclusions
8.1 Summary
8.2 Open Challenges
