Safety Analysis and Testing


Cooperative robots aim to allow human operators and robot workers to share the same workspace and work jointly towards a common goal. Because of safety concerns, however, users tend to build physical barriers to isolate robots from operators [1]. These barriers limit the level of cooperation between humans and robots, and reduce the advantages of deploying cobots in the industrial space. We consider safety aspects in the context of the CSI: Cobot project, in particular investigating the impact of changes in the system setup. The project proposes novel sensing and control techniques to improve cobots' awareness of their environment, especially regarding interactions with human operators. A crucial requirement for the adoption of such techniques is achieving confidence in the overall safety of the system.

Simulation-based testing, through the Digital Twin environment, provides a good way to evaluate the safety of a cobot system while only endangering virtual operators. The Digital Twin further offers the possibility to assess the behaviour of new sensors and controllers before their deployment in the actual system. Our approach relies on situation coverage to drive the generation of test cases. Situations bound the set of configurations the system should be able to cope with, and that should be observed during testing. For example, testing an autonomous vehicle would require it to navigate different types of road intersection (components) with various shapes (variations), combined with the types of vehicles it might encounter at the intersection, their directions of travel, and any other relevant factors.
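To make the idea of situation-driven test generation concrete, the sketch below enumerates test configurations as combinations of situation components and their variations. It is a minimal illustration, not the project's framework: the component names and values are assumptions chosen for the example.

```python
# Hypothetical sketch: generating test configurations from situation
# components and their variations. Names and values are illustrative,
# not taken from the CSI: Cobot framework.
from itertools import product

# Each situation component and the variations it can take.
SITUATION_COMPONENTS = {
    "operator_position": ["far", "near", "at_workbench"],
    "operator_motion":   ["stationary", "walking", "reaching"],
    "cobot_task":        ["idle", "pick", "place", "handover"],
    "sensor_state":      ["nominal", "occluded"],
}

def generate_configurations(components):
    """Yield every combination of component variations as a test configuration."""
    names = list(components)
    for values in product(*(components[n] for n in names)):
        yield dict(zip(names, values))

if __name__ == "__main__":
    configs = list(generate_configurations(SITUATION_COMPONENTS))
    print(f"{len(configs)} candidate configurations")
    print(configs[0])
```

In practice, exhaustive enumeration grows quickly with the number of components, which is one motivation for guiding the search towards configurations that are more likely to reveal hazards.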

Safety analysis [2] supports the identification of relevant situations by pinpointing potential hazards and the causes leading to these events. Our intuition is that monitoring for such situations during testing can help identify unsafe regions in the explored space of configurations, and guide automated testing techniques towards regions more likely to result in hazards. We further use situations and their constituent components to assess the coverage of the situation space achieved during testing, where high coverage attests to the quality of the testing strategy and indicates that the system has been exercised in a variety of contexts.
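One simple way to track this notion of coverage is to record which component variations have been observed across executed tests and report the fraction of the situation space exercised. The following is a hedged sketch of such bookkeeping; the class and field names are illustrative assumptions, not part of the project's framework.

```python
# Hypothetical sketch of situation-coverage bookkeeping: record which
# (component, variation) pairs were observed across executed tests and
# report the fraction of the situation space that was exercised.
from collections import defaultdict

class SituationCoverage:
    def __init__(self, components):
        self.components = components        # component -> list of variations
        self.observed = defaultdict(set)    # component -> set of seen variations

    def record(self, configuration):
        """Mark the variations exercised by one executed test configuration."""
        for component, variation in configuration.items():
            if variation in self.components.get(component, []):
                self.observed[component].add(variation)

    def coverage(self):
        """Fraction of all component variations observed at least once."""
        total = sum(len(v) for v in self.components.values())
        seen = sum(len(self.observed[c]) for c in self.components)
        return seen / total if total else 0.0

# Example: two executed tests covering part of a small situation space.
cov = SituationCoverage({"operator_position": ["far", "near"],
                         "cobot_task": ["idle", "pick", "place"]})
cov.record({"operator_position": "near", "cobot_task": "pick"})
cov.record({"operator_position": "far",  "cobot_task": "pick"})
print(f"situation coverage: {cov.coverage():.0%}")  # 3 of 5 variations -> 60%
```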

Our work is encapsulated in an experimental Python framework [3] which provides primitives to interact with the Digital Twin. The framework allows generated or hand-crafted system configurations to be run, and assesses the occurrence of various situations. Safety properties are expressed using temporal logic to capture the interleaving of events over time, and can be independently verified using the Python environment. Our contribution thus not only provides an automated testing environment, but also offers the tools to verify properties related to security or control in the explored systems.
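To illustrate how a temporal safety property can be checked over a recorded run, the sketch below evaluates a bounded-response property of the form G(trigger → F[0,k] response) over a trace of state snapshots. This is a minimal stand-alone example, assuming the Digital Twin run is logged as a sequence of dictionaries; the property, field names, and helper function are illustrative, not the framework's API.

```python
# Minimal sketch of checking a temporal safety property over a recorded
# trace, assuming each step of a Digital Twin run is logged as a state
# dictionary. The property and field names are illustrative only.
def always_responds(trace, trigger, response, within):
    """G(trigger -> F[0, within] response): whenever `trigger` holds,
    `response` must hold within `within` subsequent steps."""
    for i, state in enumerate(trace):
        if trigger(state):
            window = trace[i:i + within + 1]
            if not any(response(s) for s in window):
                return False
    return True

# Example trace: the operator enters the shared zone at step 2,
# and the cobot stops at step 3.
trace = [
    {"operator_in_zone": False, "cobot_stopped": False},
    {"operator_in_zone": False, "cobot_stopped": False},
    {"operator_in_zone": True,  "cobot_stopped": False},
    {"operator_in_zone": True,  "cobot_stopped": True},
]
ok = always_responds(trace,
                     trigger=lambda s: s["operator_in_zone"],
                     response=lambda s: s["cobot_stopped"],
                     within=2)
print("property holds:", ok)
```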

[1] Villani, V., Pini, F., Leali, F., Secchi, C.: Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics 55, 248–266 (2018)
[2] AAIP Body of Knowledge: 1.2.1 Considering human/machine interactions. Accessed: 2021-02