Paper Title
Robustness Certification of Visual Perception Models via Camera Motion Smoothing
Paper Authors
Paper Abstract
A vast literature shows that learning-based visual perception models are sensitive to adversarial noise, but few works consider the robustness of robotic perception models under widely-existing camera motion perturbations. To this end, we study the robustness of visual perception models under camera motion perturbations to investigate the influence of camera motion on robotic perception. Specifically, we propose a motion smoothing technique for arbitrary image classification models, whose robustness under camera motion perturbations can be certified. The proposed robustness certification framework based on camera motion smoothing provides tight and scalable robustness guarantees for visual perception modules, making them applicable to a wide range of robotic applications. To the best of our knowledge, this is the first work to provide robustness certification for deep perception modules against camera motion, which improves the trustworthiness of robotic perception. A realistic indoor robotic dataset with a dense point cloud map of the entire room, MetaRoom, is introduced for the challenging certifiably robust perception task. We conduct extensive experiments to validate the certification approach via motion smoothing against camera motion perturbations. Our framework guarantees a certified accuracy of 81.7% against camera translation perturbations along the depth direction within −0.1 m to 0.1 m. We also validate the effectiveness of our method on a real-world robot by conducting hardware experiments on a robotic arm with an eye-in-hand camera. The code is available at https://github.com/HanjiangHu/camera-motion-smoothing.
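The core idea of smoothing-based certification can be illustrated with a minimal sketch: the smoothed classifier aggregates a base classifier's predictions over images re-rendered under many sampled camera offsets, and the majority label together with its vote fraction is what certification bounds reason about. The snippet below is a hypothetical Monte Carlo illustration only, not the paper's implementation; the `warp` function standing in for re-rendering from the dense point cloud, the toy classifier, and all parameter names are assumptions.

```python
import random

def smoothed_classify(image, classifier, warp, n_samples=1000, b=0.1, seed=0):
    """Monte Carlo estimate of a motion-smoothed classifier: majority vote
    over predictions on images re-rendered under camera translation offsets
    sampled uniformly from [-b, b] metres along the depth direction."""
    rng = random.Random(seed)
    votes = {}
    for _ in range(n_samples):
        t = rng.uniform(-b, b)              # sampled depth translation (m)
        pred = classifier(warp(image, t))   # classify the re-rendered view
        votes[pred] = votes.get(pred, 0) + 1
    top = max(votes, key=votes.get)
    return top, votes[top] / n_samples      # majority label, vote fraction

# Toy stand-ins (hypothetical): a "warp" that brightens the image with
# forward motion, and a classifier thresholding the mean intensity.
toy_image = [0.5] * 16
toy_warp = lambda img, t: [p + t for p in img]
toy_clf = lambda img: int(sum(img) / len(img) > 0.4)

label, frac = smoothed_classify(toy_image, toy_clf, toy_warp)
print(label, frac)
```

In practice a high vote fraction for the top label is what allows a certified radius of camera motion to be derived; here the toy classifier is stable over the whole sampled range, so the vote is unanimous.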