Automatic Rate-Distortion Classification for the IoT: Towards Signal-Adaptive Network Protocols
Zordan, Davide; Parada, Raul; Rossi, Michele; Zorzi, Michele
2017
Abstract
The Internet of Things (IoT) is being used to monitor a wide range of physical phenomena. In this paper, we are concerned with the extraction of features from the gathered IoT signals and, specifically, with the online estimation of their rate-distortion relationship. This information is key to configuring and adapting data compression and in-network processing protocols and is therefore a prime functionality for IoT networks: lossy compression can often be applied at the sources to save transmission energy while still meeting application requirements on reconstruction quality. The task is, however, signal- and time-dependent, as different signals are usually characterized by different rate-distortion relations and the signal statistics may also change over time. Here, we first formulate the rate-distortion estimation task, framing it as a classification problem. We then consider the following classification algorithms from the literature: multilayer perceptron, support vector machine, random forest, and linear discriminant analysis, and use them to automatically assess rate-distortion curves in an online fashion and from a small number of signal samples. These algorithms are compared in terms of classification accuracy, training time, and memory footprint. Numerical results reveal that, although the problem is inherently complex, a careful combination of feature extraction and classification tools makes it possible to reach high classification accuracy using only a few signal features (from one to four). The best algorithm (random forest) also entails a short training time and, if properly tuned, has a modest memory footprint.
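To make the setup concrete, the following is a minimal sketch of the classification pipeline the abstract describes: extract a handful of cheap features from a window of signal samples and map them to a rate-distortion class with a random forest, the best-performing classifier in the paper's comparison. The feature choices, the `extract_features` helper, the class count, and the synthetic training data are all illustrative assumptions; the paper's actual feature set and training procedure are not detailed in this abstract.

```python
# Hedged sketch of rate-distortion classification for IoT signals.
# Feature set, class count, and data generation are hypothetical stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def extract_features(window):
    """A few cheap statistics from a window of signal samples
    (hypothetical; the paper reports using one to four features)."""
    diffs = np.diff(window)
    return np.array([
        window.std(),                                  # sample spread
        np.abs(diffs).mean(),                          # mean first-difference magnitude
        np.corrcoef(window[:-1], window[1:])[0, 1],    # lag-1 autocorrelation
    ])

# Synthetic training set: each window is labeled with the index of the
# canonical rate-distortion curve assumed to describe it (4 classes here).
n_windows, window_len, n_classes = 400, 128, 4
X = np.empty((n_windows, 3))
y = rng.integers(0, n_classes, size=n_windows)
for i in range(n_windows):
    # Smoother (more correlated) signals stand in for more compressible classes.
    k = 2 + 3 * y[i]
    X[i] = extract_features(
        np.convolve(rng.standard_normal(window_len), np.ones(k) / k, mode="same")
    )

# Forest size and depth bound the model's memory footprint, echoing the
# abstract's point that a properly tuned random forest stays modest.
clf = RandomForestClassifier(n_estimators=50, max_depth=8, random_state=0)
clf.fit(X, y)

# Online use: a node featurizes a fresh window and looks up the predicted
# rate-distortion class to configure its lossy compressor accordingly.
new_window = np.convolve(rng.standard_normal(window_len), np.ones(8) / 8, mode="same")
predicted = clf.predict(extract_features(new_window).reshape(1, -1))[0]
print(f"predicted rate-distortion class: {predicted}")
```

In this sketch, classifying a window costs only a feature extraction plus a forest lookup, which is what makes an online, per-signal assessment of the rate-distortion curve plausible on constrained IoT nodes.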