Citation

Yang X, Rahmani H, Black S, Williams BM. Weakly Supervised Co-training with Swapping Assignments for Semantic Segmentation. arXiv preprint arXiv:2402.17891. 2024 Feb 27.

@article{yang2024weakly,
  title={Weakly supervised co-training with swapping assignments for semantic segmentation},
  author={Yang, Xinyu and Rahmani, Hossein and Black, Sue and Williams, Bryan M},
  journal={arXiv preprint arXiv:2402.17891},
  year={2024}
}

Abstract

Class activation maps (CAMs) are commonly employed in weakly supervised semantic segmentation (WSSS) to produce pseudo-labels. Due to incomplete or excessive class activation, existing studies often resort to offline CAM refinement, introducing additional stages or proposing offline modules. This can cause optimization difficulties for single-stage methods and limit generalizability. In this study, we aim to reduce the observed CAM inconsistency and error to mitigate reliance on refinement processes. We propose an end-to-end WSSS model incorporating guided CAMs, wherein our segmentation model is trained while concurrently optimizing CAMs online. Our method, Co-training with Swapping Assignments (CoSA), leverages a dual-stream framework, where one sub-network learns from the swapped assignments generated by the other. We introduce three techniques in this framework: i) soft perplexity-based regularization to penalize uncertain regions; ii) a threshold-searching approach to dynamically revise the confidence threshold; and iii) contrastive separation to address the coexistence problem. CoSA demonstrates exceptional performance, achieving mIoU of 76.2% and 51.0% on the VOC and COCO validation datasets, respectively, surpassing existing baselines by a substantial margin. Notably, CoSA is the first single-stage approach to outperform all existing multi-stage methods, including those with additional supervision. Source code is publicly available.
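To make the swapped-assignment idea concrete, below is a minimal PyTorch-style sketch of a co-training loss in the spirit of the abstract. It is an illustration under stated assumptions, not CoSA's actual implementation: the function names (make_pseudo_labels, perplexity_weights, swapped_assignment_loss) and the exact weighting scheme are hypothetical, the fixed threshold argument stands in for the paper's dynamic threshold search, and contrastive separation is omitted.

import torch
import torch.nn.functional as F

def make_pseudo_labels(cam, threshold):
    # Turn CAMs of shape (B, C, H, W) into hard pseudo-labels,
    # marking low-confidence pixels as ignore (255).
    conf, labels = cam.max(dim=1)
    labels[conf < threshold] = 255
    return labels

def perplexity_weights(logits):
    # Soft uncertainty-based weighting (a stand-in for CoSA's
    # perplexity regularization): pixels whose predicted distribution
    # is near-uniform (high entropy) receive lower weight.
    p = logits.softmax(dim=1)
    entropy = -(p * p.clamp_min(1e-8).log()).sum(dim=1)       # (B, H, W)
    max_entropy = torch.log(torch.tensor(float(logits.size(1))))
    return 1.0 - entropy / max_entropy                        # in [0, 1]

def swapped_assignment_loss(seg_a, cam_a, seg_b, cam_b, threshold=0.5):
    # Swap assignments between the two streams: stream A's segmentation
    # head is supervised by pseudo-labels from stream B's CAMs, and
    # vice versa. CAMs are detached so supervision flows one way.
    pl_from_b = make_pseudo_labels(cam_b.detach(), threshold)
    pl_from_a = make_pseudo_labels(cam_a.detach(), threshold)
    loss_a = F.cross_entropy(seg_a, pl_from_b, ignore_index=255, reduction="none")
    loss_b = F.cross_entropy(seg_b, pl_from_a, ignore_index=255, reduction="none")
    w_a = perplexity_weights(seg_a.detach())
    w_b = perplexity_weights(seg_b.detach())
    return (w_a * loss_a).mean() + (w_b * loss_b).mean()

if __name__ == "__main__":
    # Toy shapes: batch of 2, 21 classes (VOC), 32x32 feature maps.
    cam_a, cam_b = torch.rand(2, 21, 32, 32), torch.rand(2, 21, 32, 32)
    seg_a, seg_b = torch.randn(2, 21, 32, 32), torch.randn(2, 21, 32, 32)
    print(swapped_assignment_loss(seg_a, cam_a, seg_b, cam_b).item())

In the paper, the confidence threshold is not fixed as it is here but revised online by the proposed threshold-searching procedure; the sketch only shows where such a threshold would enter.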

Results

Comparing CoSA with MCTformer, ToCo, and BECO on COCO (comparison figure):

Comparing CoSA with MCTformer, ToCo, and BECO on VOC (comparison figure):