FMixCutMatch for semi-supervised deep learning
- PMID: 33217685
- DOI: 10.1016/j.neunet.2020.10.018
Abstract
Mixed sample augmentation (MSA) has seen great success in semi-supervised learning (SSL): two training samples are mixed as an augmentation strategy that effectively smooths the training space. Motivated in particular by insights into the efficacy of cut-mix, we propose FMixCut, an MSA that combines Fourier space-based data mixing (FMix) with the proposed Fourier space-based data cutting (FCut) for labeled and unlabeled data augmentation. Specifically, for the SSL task, our approach first generates soft pseudo-labels from the model's previous predictions. The model is then trained so that its outputs on FMix-generated samples are consistent with the corresponding mixed soft pseudo-labels. In addition, we propose FCut, a new Cutout-based data augmentation strategy that reuses the two masked sample pairs produced by FMix for weighted cross-entropy minimization. Furthermore, two regularization techniques, batch label distribution entropy maximization and sample confidence entropy minimization, further boost training efficiency. Finally, we introduce a dynamic labeled-unlabeled data mixing (DDM) strategy to accelerate the convergence of the model. Combining these components, we call the resulting SSL approach "FMixCutMatch" (FMCmatch for short). The proposed FMCmatch achieves state-of-the-art performance on CIFAR-10/100, SVHN and Mini-Imagenet across a variety of SSL settings with the CNN-13, WRN-28-2 and ResNet-18 networks. In particular, our method achieves a 4.54% test error on CIFAR-10 with 4K labels using CNN-13 and a 41.25% Top-1 test error on Mini-Imagenet with 10K labels using ResNet-18. Our code for reproducing these results is publicly available at https://github.com/biuyq/FMixCutMatch.
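The abstract only names the building blocks; as a point of reference, the sketch below illustrates, under our own assumptions, two of them in PyTorch/NumPy: an FMix-style binary mask obtained by thresholding low-pass-filtered Fourier noise (used here to mix inputs and their soft pseudo-labels), and the two entropy regularizers (batch label distribution entropy maximization and sample confidence entropy minimization). All function names (`low_freq_mask`, `fmix_pair`, `entropy_regularizers`), hyperparameters, and the reading of FCut in the comments are illustrative assumptions rather than the authors' implementation; the actual code is in the linked repository.

```python
import numpy as np
import torch

def low_freq_mask(shape, decay_power=3.0, lam=0.5):
    """Sample a binary mask supported on a low-frequency region.

    Random complex Fourier noise is attenuated by 1/f^decay_power, inverted
    back to image space, and thresholded so that roughly a fraction `lam`
    of pixels is kept. This mirrors the FMix idea; the exact filtering
    details of the paper/official code may differ.
    """
    h, w = shape
    fy = np.fft.fftfreq(h)[:, None]           # frequencies along height
    fx = np.fft.rfftfreq(w)[None, :]          # frequencies along width (rfft layout)
    freq = np.sqrt(fy ** 2 + fx ** 2)
    freq[0, 0] = 1.0 / max(h, w)              # avoid division by zero at DC
    spectrum = (np.random.randn(h, w // 2 + 1)
                + 1j * np.random.randn(h, w // 2 + 1)) / freq ** decay_power
    grey = np.fft.irfft2(spectrum, s=(h, w))  # smooth low-frequency image
    threshold = np.quantile(grey, 1.0 - lam)  # keep the top `lam` fraction of pixels
    return torch.from_numpy((grey > threshold).astype(np.float32))

def fmix_pair(x1, x2, y1, y2, alpha=1.0, decay_power=3.0):
    """Mix two batches with a shared Fourier mask and mix their soft labels.

    A plausible reading of FCut (not confirmed by the abstract alone) is to
    instead train on the two masked halves, mask*x1 and (1-mask)*x2, with a
    lam-weighted cross-entropy.
    """
    lam = float(np.random.beta(alpha, alpha))
    mask = low_freq_mask(x1.shape[-2:], decay_power, lam).to(x1.device)
    x_mix = mask * x1 + (1.0 - mask) * x2
    y_mix = lam * y1 + (1.0 - lam) * y2       # mixed soft (pseudo-)labels
    return x_mix, y_mix, mask, lam

def entropy_regularizers(logits):
    """Our reading of the two regularizers described in the abstract:
    maximize the entropy of the batch-averaged label distribution and
    minimize the per-sample prediction entropy (confidence)."""
    p = logits.softmax(dim=1)
    batch_marginal = p.mean(dim=0)
    batch_entropy = -(batch_marginal * batch_marginal.clamp_min(1e-8).log()).sum()
    sample_entropy = -(p * p.clamp_min(1e-8).log()).sum(dim=1).mean()
    # Subtract the batch entropy (to maximize it) and add the sample entropy
    # (to minimize it) when this term is added to the total loss.
    return -batch_entropy + sample_entropy
```

In an SSL training loop one would, under these assumptions, compare the model's output on `x_mix` against `y_mix` built from soft pseudo-labels and add `entropy_regularizers` to the total loss; this is only a sketch of how the described pieces could fit together, not the paper's training procedure.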
Keywords: Mixed sample augmentation; Regularization; Semi-supervised learning; Soft pseudo-labels.
Copyright © 2020 Elsevier Ltd. All rights reserved.
Conflict of interest statement
Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.