An exploratory study of self-supervised pre-training on partially supervised multi-label classification on chest X-ray images
Zheng, Kaizhong; Yu, Shujian; Li, Baojuan; Jenssen, Robert; Chen, Badong
Applied Soft Computing, Volume 163, 111855
September 1, 2024
This paper serves as the first empirical study of self-supervised pre-training for partially supervised learning, an emerging yet underexplored learning paradigm in which annotations are partially missing. This is particularly important in the medical imaging domain, where label scarcity is the main obstacle to practical applications. To raise awareness of partially supervised learning, we use partially supervised multi-label classification of chest X-ray images as an instance task to illustrate the challenges involved. A series of simulated experiments shows that solving multiple pretext tasks jointly during pre-training can significantly improve downstream performance in the partially supervised setting. Further, we propose a new pretext task, reverse vicinal risk minimization, and demonstrate that it provides a more robust and efficient alternative to existing pretext tasks for the task of interest.