This book presents breakthrough, cutting-edge progress in collaborative perception and mapping by proposing a novel multimodal perception–relative localization–collaborative mapping framework for collaborative robot systems. The organization of the book allows readers to analyze, model and design collaborative perception technology for autonomous robots. It lays the foundations of collaborative robot systems and provides the fundamental theory and technical guidelines for collaborative perception and mapping. By providing extensive simulation and real-world experimental results across its chapters, the book significantly promotes the development of autonomous systems from individual intelligence to collaborative intelligence. This book caters to engineers, graduate students and researchers in the fields of autonomous systems, robotics, computer vision and collaborative perception.
Contents:
Introduction
Technical Background
Point Registration Approach for Map Fusion
Submap-Based Probabilistic Inconsistency Detection
Hierarchical Map Fusion Framework with Homogeneous Sensors
Collaborative 3D Mapping using Heterogeneous Sensors
All-Weather Collaborative Mapping with Dynamic Objects
Collaborative Probabilistic Semantic Mapping using CNN
Yufeng Yue received the B.Eng. degree in automation from Beijing Institute of Technology, Beijing, China, in 2014, and the Ph.D. degree from Nanyang Technological University, Singapore, in 2019. He was a visiting scholar with the University of California, Los Angeles, in 2019, and served as a research fellow with the School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore, from 2018 to 2020. He is currently an associate professor with the School of Automation, Beijing Institute of Technology, Beijing, China. His research interests include perception, mapping and navigation for collaborative autonomous systems in complex environments.
Danwei Wang leads the autonomous mobile robotics research group. He received his Ph.D. and M.S.E. degrees from the University of Michigan, Ann Arbor, in 1989 and 1984, respectively, and his B.E. degree from the South China University of Technology, China, in 1982. Since 1989, he has been with the School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore. He is currently a professor and co-director of the ST Engineering – NTU Corporate Laboratory, and a senator in the NTU Academics Council. He has served as general chairman, technical chairman and in various other positions at international conferences such as ICARCV and IROS. He is an associate editor for the International Journal of Humanoid Robotics and served as an associate editor of the Conference Editorial Board, IEEE Control Systems Society, from 1998 to 2005. He was a recipient of the Alexander von Humboldt Fellowship, Germany. His research interests include robotics, control theory and applications.