Semi-supervised Learning with a Teacher-Student Paradigm for Histopathology Classification: A Resource to Face Data Heterogeneity and Lack of Local Annotations
Marini N.; Atzori M.
2021
Abstract
Training classification models in the medical domain is often difficult due to data heterogeneity (related to acquisition procedures) and due to the difficulty of obtaining sufficient amounts of annotations from specialized experts. This is particularly true in digital pathology, where models do not generalize easily. This paper presents a novel approach for the generalization of models in conditions where heterogeneity is high and annotations are few. The approach relies on applying a semi-supervised teacher/student paradigm to different datasets and annotations. The paradigm combines a small amount of strongly-annotated data with a large amount of unlabeled data to train two Convolutional Neural Networks (CNNs): the teacher and the student model. The teacher model is trained with strong labels and used to generate pseudo-labeled samples from the unlabeled data. The student model is trained by combining the pseudo-labeled samples with a small amount of strongly-annotated data. The paradigm is evaluated via the student model's performance on Gleason pattern and Gleason score classification in prostate cancer images, and is compared with a fully-supervised learning approach for training the student model. In order to evaluate the capability of the approach to generalize, the datasets used are highly heterogeneous in visual characteristics and were collected from two different medical institutions. The models trained with the teacher/student paradigm show an improvement in performance over fully-supervised training. The models generalize better on both datasets, despite the inter-dataset heterogeneity, alleviating overfitting. Classification performance improves both in Gleason pattern classification at patch level (κ=0.6129±0.0127, from κ=0.5608±0.0308) and in Gleason score classification at WSI level (κ=0.4477±0.0460, from κ=0.2814±0.1312).
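The teacher/student pipeline the abstract describes can be summarized in three steps: train a teacher on the strongly-annotated patches, let it pseudo-label the unlabeled pool, then train a student on the combined set. Below is a minimal sketch of that loop, assuming a PyTorch implementation; the tiny CNN, the four-class setup, and the 0.9 confidence threshold are illustrative assumptions, not the authors' configuration.

```python
# Minimal teacher/student pseudo-labeling sketch (illustrative assumptions
# throughout; not the paper's exact architecture or hyperparameters).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

NUM_CLASSES = 4  # assumed: e.g. benign + Gleason patterns 3-5


def make_cnn() -> nn.Module:
    """A tiny stand-in CNN; the paper uses a larger architecture."""
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(16, NUM_CLASSES),
    )


def train(model: nn.Module, loader: DataLoader, epochs: int = 1) -> None:
    """Standard supervised training with cross-entropy loss."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()


@torch.no_grad()
def pseudo_label(teacher: nn.Module, unlabeled_x: torch.Tensor,
                 threshold: float = 0.9):
    """Keep only unlabeled patches the teacher classifies confidently."""
    teacher.eval()
    probs = torch.softmax(teacher(unlabeled_x), dim=1)
    conf, labels = probs.max(dim=1)
    keep = conf >= threshold
    return unlabeled_x[keep], labels[keep]


# Toy tensors standing in for tissue patches (3x64x64).
strong_x = torch.randn(32, 3, 64, 64)
strong_y = torch.randint(0, NUM_CLASSES, (32,))
unlabeled_x = torch.randn(128, 3, 64, 64)

# 1) Teacher: trained on the small strongly-annotated set.
teacher = make_cnn()
train(teacher, DataLoader(TensorDataset(strong_x, strong_y), batch_size=8))

# 2) Teacher generates pseudo-labels for the unlabeled pool.
pl_x, pl_y = pseudo_label(teacher, unlabeled_x)

# 3) Student: trained on pseudo-labeled plus strongly-annotated data.
student = make_cnn()
mixed_x = torch.cat([strong_x, pl_x])
mixed_y = torch.cat([strong_y, pl_y])
train(student, DataLoader(TensorDataset(mixed_x, mixed_y),
                          batch_size=8, shuffle=True))
```

The confidence threshold in step 2 is one common way to filter pseudo-labels so that only samples the teacher is sure about reach the student; whether and how the paper filters its pseudo-labeled samples is not specified in the abstract.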