Title
Regulating Safety and Security in Autonomous Robotic Systems
Authors
Abstract
Autonomous Robotic Systems are inherently safety-critical and have complex safety and security issues to consider (for example, a security failure can lead to a safety failure). Before they are deployed, these systems have to show evidence that they adhere to a set of regulator-defined rules for safety and security. Formal methods provide robust approaches to proving that a system obeys given rules, but formalising (usually natural-language) rules can prove difficult. Regulations specifically for autonomous systems are still being developed, but the safety rules for a human operator are a good starting point when trying to show that an autonomous system is safe. For applications of autonomous systems such as driverless cars and pilotless aircraft, there are clear rules for human operators, which have been formalised and used to prove that an autonomous system obeys some or all of these rules. However, in the space and nuclear sectors applications are more likely to differ from one another, so a set of general safety principles has been developed instead. This allows novel applications to be assessed for their safety, but such principles are difficult to formalise. To improve this situation, we are collaborating with regulators and the community in the space and nuclear sectors to develop guidelines for autonomous and robotic systems that are amenable to robust (formal) verification. These activities also have the benefit of bridging gaps in knowledge within both the space and nuclear communities and academia.
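As a minimal illustration of the kind of formalisation the abstract refers to (not taken from the paper itself), a natural-language operator rule such as "whenever an obstacle is detected, the vehicle must eventually stop or change course" could be expressed in Linear Temporal Logic, where obstacleDetected, stopped, and courseChanged are hypothetical atomic propositions about the system state:

\[ \square \big( \mathit{obstacleDetected} \rightarrow \lozenge\, (\mathit{stopped} \lor \mathit{courseChanged}) \big) \]

A model checker could then be asked whether a model of the autonomous controller satisfies this property; translating regulator-defined, natural-language rules into properties of this form is the step the abstract identifies as difficult, particularly for the general safety principles used in the space and nuclear sectors.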