Collaborative Human Sensing with mmWave Systems

CANIL, MARCO
2024

Abstract

Remote perception of human movements has the potential to revolutionize the way we interact with technology, enabling unprecedented integration into everyday life. In this panorama, RADAR devices operating in the mmWave region of the radio spectrum have sparked great interest in both academia and industry, as they combine highly accurate sensing capabilities with appealing properties of mmWaves, such as insensitivity to extreme lighting conditions and to the presence of dust, smoke, or rain. Moreover, mmWave RADARs raise fewer privacy concerns than vision-based monitoring systems, as no image of the surroundings is captured. However, commercial mmWave RADAR devices have limited range (up to 6-8 m) and are subject to occlusion, which may constitute a significant drawback in large, crowded rooms containing furniture and internal walls. Thus, covering large indoor spaces requires multiple RADARs with known relative positions and orientations, together with algorithms to combine their output information. In this thesis, we focus on providing practical solutions for the adoption of mmWave RADARs in real-world settings. In particular, we devise algorithms for the automatic deployment of RADAR networks and for their use in human sensing. Initially, we develop a method for the self-calibration of RADAR sensor networks. The problem is to automatically estimate the relative position and orientation of the RADARs so as to enable data fusion. The proposed solution leverages the trajectories of people moving freely in the common field of view (FoV) of the RADARs, requiring no human intervention. Then, we develop an experimental testbed for the easy deployment and testing of RADAR network algorithms. Subsequently, we tackle the problem of data fusion in the context of mmWave monitoring systems. We address this in three ways: (i) considering a system where data from multiple RADARs are fused to provide a single, unified people-tracking output, (ii) considering the cooperation of mmWave RADARs with other sensors, and (iii) exploiting communication devices for sensing purposes. In (i), each node of the RADAR network is endowed with resource-constrained computational capabilities, performs people tracking independently, and shares its tracking information with a fusion center that fuses the data to provide enhanced, unified tracking across all sensors. In (ii), a thermal camera (TC) is used in conjunction with a mmWave RADAR to provide concurrent contact tracing and body temperature screening. Finally, in (iii), the use of communication devices is considered in an integrated sensing and communication (ISAC) scenario, where human sensing parameters are extracted from the communication packets exchanged between one transmitter (TX) and multiple receivers (RXs).
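
To give a concrete flavor of the self-calibration step described in the abstract, the sketch below aligns the trajectory of a person as observed by two RADARs and recovers their relative rotation and translation with a standard least-squares (Kabsch / orthogonal Procrustes) fit. This is only an illustrative sketch, not the algorithm developed in the thesis: it assumes that matched 2D trajectory points from the two sensors are already available, and all names and numbers in it are hypothetical.

import numpy as np

def estimate_relative_pose(points_a: np.ndarray, points_b: np.ndarray):
    """Estimate rotation R and translation t such that R @ p_b + t ~= p_a.

    points_a, points_b: (N, 2) arrays of matched 2D positions of the same
    person, expressed in the local frames of RADAR A and RADAR B.
    Uses the standard Kabsch / orthogonal Procrustes solution.
    """
    # Center both point sets on their centroids.
    centroid_a = points_a.mean(axis=0)
    centroid_b = points_b.mean(axis=0)
    a_centered = points_a - centroid_a
    b_centered = points_b - centroid_b

    # Cross-covariance and SVD give the least-squares optimal rotation.
    H = b_centered.T @ a_centered
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = centroid_a - R @ centroid_b
    return R, t

if __name__ == "__main__":
    # Hypothetical example: RADAR B is rotated by 30 degrees and shifted by
    # (2, 1) with respect to RADAR A; a noisy trajectory is seen by both.
    rng = np.random.default_rng(0)
    traj_a = rng.uniform(0, 6, size=(50, 2))             # trajectory in A's frame
    theta = np.deg2rad(30)
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    t_true = np.array([2.0, 1.0])
    traj_b = (R_true.T @ (traj_a - t_true).T).T          # same trajectory in B's frame
    traj_b += 0.05 * rng.standard_normal(traj_b.shape)   # sensor noise

    R_est, t_est = estimate_relative_pose(traj_a, traj_b)
    print("estimated rotation (deg):", np.rad2deg(np.arctan2(R_est[1, 0], R_est[0, 0])))
    print("estimated translation:", t_est)

In a real deployment the matched points would come from associating tracks of the same person across sensors, which is part of what makes the calibration problem non-trivial; the closed-form alignment above only illustrates the final pose-estimation step under that assumption.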
Collaborative Human Sensing with mmWave Systems / Canil, Marco. - (2024 Mar 21).
Files in this record:
tesi_Marco_Canil.pdf - Doctoral thesis, open access, 12.34 MB, Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3511373