
DC-SSL: Addressing Mismatched Class Distribution in Semi-supervised Learning

Conference Paper


Abstract


  • Consistency-based semi-supervised learning (SSL) has recently achieved promising performance. However, this success largely depends on the assumption that the labeled and unlabeled data share an identical class distribution, which is hard to meet in practice. A distribution mismatch between the labeled and unlabeled sets can severely bias the pseudo-labels in SSL, resulting in significant performance degradation. To bridge this gap, we put forward a new SSL framework, named Distribution Consistency SSL (DC-SSL), which rectifies the pseudo-labels from a distribution perspective. The basic idea is to directly estimate a reference class distribution (RCD), regarded as a surrogate for the ground-truth class distribution of the unlabeled data, and then improve the pseudo-labels by encouraging the predicted class distribution (PCD) of the unlabeled data to gradually approach the RCD. To this end, this paper revisits the Exponential Moving Average (EMA) model and uses it to estimate the RCD in an iteratively improved manner, achieved with a momentum-update scheme throughout training. On top of this, two strategies are proposed that use the RCD to rectify the pseudo-label predictions: an efficient training-free scheme and a training-based alternative that generates more accurate and reliable predictions. DC-SSL is evaluated on multiple SSL benchmarks and demonstrates remarkable performance improvements over competitive methods under both matched- and mismatched-distribution scenarios.

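The two core ideas in the abstract — a momentum (EMA) update of the reference class distribution, and rescaling pseudo-label predictions so the predicted class distribution moves toward it — can be sketched as follows. This is a minimal illustration assuming the training-free scheme amounts to a distribution-ratio reweighting followed by renormalization; the function names, the `momentum` value, and the exact rectification rule are assumptions, not the paper's verified implementation.

```python
def ema_update(rcd, batch_mean_probs, momentum=0.999):
    """Momentum update of the reference class distribution (RCD).

    rcd              -- current RCD estimate, one probability per class
    batch_mean_probs -- mean predicted probabilities over an unlabeled batch
    """
    return [momentum * r + (1.0 - momentum) * p
            for r, p in zip(rcd, batch_mean_probs)]


def rectify(probs, rcd, pcd, eps=1e-8):
    """Training-free rectification of one prediction.

    Rescales class probabilities by the ratio RCD / PCD, so classes the
    model currently over-predicts are down-weighted (and vice versa),
    then renormalizes to a valid distribution.
    """
    scaled = [p * (r / (q + eps)) for p, r, q in zip(probs, rcd, pcd)]
    z = sum(scaled)
    return [s / z for s in scaled]
```

For example, with a uniform 4-class RCD and a PCD skewed toward class 0, `rectify` shrinks the probability of class 0 before the pseudo-label is taken, counteracting the bias the abstract attributes to distribution mismatch.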
Publication Date


  • 2022

Citation


  • Zhao, Z., Zhou, L., Duan, Y., Wang, L., Qi, L., & Shi, Y. (2022). DC-SSL: Addressing Mismatched Class Distribution in Semi-supervised Learning. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Vol. 2022-June (pp. 9747-9755). doi:10.1109/CVPR52688.2022.00953

Scopus Eid


  • 2-s2.0-85136092814

Start Page


  • 9747

End Page


  • 9755

Volume


  • 2022-June
