Paper Title

maplab 2.0 -- A Modular and Multi-Modal Mapping Framework

Authors

Andrei Cramariuc, Lukas Bernreiter, Florian Tschopp, Marius Fehr, Victor Reijgwart, Juan Nieto, Roland Siegwart, Cesar Cadena

Abstract

Integration of multiple sensor modalities and deep learning into Simultaneous Localization And Mapping (SLAM) systems are areas of significant interest in current research. Multi-modality is a stepping stone towards achieving robustness in challenging environments and interoperability of heterogeneous multi-robot systems with varying sensor setups. With maplab 2.0, we provide a versatile open-source platform that facilitates developing, testing, and integrating new modules and features into a fully-fledged SLAM system. Through extensive experiments, we show that maplab 2.0's accuracy is comparable to the state-of-the-art on the HILTI 2021 benchmark. Additionally, we showcase the flexibility of our system with three use cases: i) large-scale (approx. 10 km) multi-robot multi-session (23 missions) mapping, ii) integration of non-visual landmarks, and iii) incorporating a semantic object-based loop closure module into the mapping framework. The code is available open-source at https://github.com/ethz-asl/maplab.
