Learning manual assembly through real-time motion capture for operator training with augmented reality

Pilati F.;Faccio M.;Gamberi M.;
2020

Abstract

The current fourth industrial revolution significantly impacts production processes. The personalized production paradigm which distinguishes Industry 4.0 enables customers to order unique products, defined by the specific features selected. The operators involved in the manual assembly of such workpieces have to process an enormous component variety, adapting their tasks from product to product with limited learning opportunities. On the other hand, digital technologies have evolved significantly in the last decade and their adoption on industrial shop floors is increasingly widespread. In particular, camera-based marker-less motion capture has achieved wide popularity since it represents a cheap, reliable and non-invasive solution to track, trace and digitalize human movements in different environments. Within this framework, this research proposes an original hardware/software architecture to assist operators in real time during the training phase of manual assembly processes, supporting their learning in terms of both rate and quality. A marker-less depth camera captures human motions in relation to the workstation environment, whereas an augmented reality application based on visual feedback guides the operator through consecutive assembly tasks during the training phase. An experimental campaign is performed at the Learning factory of the Digital production university laboratory to validate the proposed architecture against traditional paper-based training instructions. A real industrial case study is adopted to test and quantitatively evaluate the benefits of the developed technology compared to the traditional approach in terms of learning rate, which increases by 22%, with a reduction in manual process duration of up to 51% during the first assembly cycles.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3381041
Citations
  • Scopus 49