Trust the Robot! Enabling Flexible Collaboration With Humans via Multi-Sensor Data Integration

Guidolin M.; Berti N.; Gnesotto P.; Battini D.; Reggiani M.
2024

Abstract

Collaborative robots in manufacturing offer significant potential for developing synergies between the skills of the workforce and the capabilities of the robots, increasing efficiency and reducing the psycho-physical effort required from workers. One of the main barriers to the adoption of these systems is workers' lack of trust in robots, which can negatively affect both performance and well-being. Addressing this issue requires a precise monitoring phase, with sensors and algorithms that enable data fusion from different sources. Nevertheless, in an operative context, cumbersome setups for monitoring both workers and cobots can slow down performance and create bias and unsatisfactory working conditions. For this reason, this work proposes a framework describing the transition from an integrated Human Digital Twin toward a lighter monitoring setup, exploiting Machine Learning algorithms during the operative phase to reduce the number of required sensors. While extensive data is valuable during the design phase of collaborative workstations, the operational phase should minimize sensor use, leveraging pre-gathered data to train Machine Learning networks that estimate the missing quantities. To achieve such data quality in the pre-deployment and design phases, we introduce an algorithm for real-time alignment of body poses estimated by different Motion Capture technologies. This method provides accurate, occlusion-robust body pose estimation while also correcting the drift that affects inertial measurement units. Consequently, the proposed approach establishes a robust foundation for enhancing Human-Robot Collaboration by ensuring precise and reliable real-time body pose estimation, a crucial step for advancing safety and efficiency in the manufacturing field.
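The abstract does not detail the alignment algorithm itself. As an illustration of the general idea — registering a drifted IMU-based skeleton onto a camera-based one via a rigid transform — the following minimal sketch uses the standard Kabsch algorithm on corresponding joint positions. All names and the 17-joint layout are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def kabsch_align(source, target):
    """Rigid alignment (rotation R + translation t) of one skeleton's
    joint positions onto another, minimizing ||R @ p_i + t - q_i||.
    source, target: (N, 3) arrays of corresponding joint positions."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    P = source - src_c          # center both point clouds
    Q = target - tgt_c
    H = P.T @ Q                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Synthetic example: the "IMU" skeleton is a rotated/translated
# copy of the "camera" skeleton (i.e., pure rigid drift).
rng = np.random.default_rng(0)
camera_joints = rng.normal(size=(17, 3))   # hypothetical 17-joint body model
theta = 0.3
R_drift = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                    [np.sin(theta),  np.cos(theta), 0.0],
                    [0.0,            0.0,           1.0]])
imu_joints = camera_joints @ R_drift.T + np.array([0.5, -0.2, 1.0])

R, t = kabsch_align(imu_joints, camera_joints)
realigned = imu_joints @ R.T + t
print(np.allclose(realigned, camera_joints, atol=1e-8))  # True
```

In a real multi-sensor pipeline the drift is not purely rigid, so such an alignment would typically run per frame (or per time window) against the occlusion-free subset of camera-tracked joints; the sketch only shows the geometric core.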
2024 IEEE 29th International Conference on Emerging Technologies and Factory Automation (ETFA)
Files for this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3539322
Citations
  • Scopus: 1