ICIP 2021 Paper Detail

Paper ID: BIO-1.5
Paper Title: RAR-U-NET: A RESIDUAL ENCODER TO ATTENTION DECODER BY RESIDUAL CONNECTIONS FRAMEWORK FOR SPINE SEGMENTATION UNDER NOISY LABELS
Authors: Ziyang Wang, Oxford University, United Kingdom; Zhengdong Zhang, Beihang University, China; Irina Voiculescu, Oxford University, United Kingdom
Session: BIO-1: Biomedical Signal Processing 1
Location: Area C
Session Time: Monday, 20 September, 13:30 - 15:00
Presentation Time: Monday, 20 September, 13:30 - 15:00
Presentation: Poster
Topic: Biomedical Signal Processing: Medical image analysis
IEEE Xplore: Open Preview available
Abstract: Segmentation algorithms for medical images are widely studied for various clinical and research purposes. In this paper, we propose a new and efficient method for medical image segmentation under noisy labels. The method operates within a deep learning paradigm and incorporates four novel contributions. First, residual interconnections are explored across encoders at different scales to transfer gradient information efficiently. Second, the four copy-and-crop connections are replaced by residual-block-based concatenation to alleviate the disparity between encoders and decoders. Third, convolutional attention modules for feature refinement are studied on decoders at all scales. Finally, an adaptive denoising learning strategy (ADL) is introduced into the training process to limit the influence of noisy labels. Experimental results are reported on a publicly available benchmark database of spine CT scans. Our proposed method achieves competitive performance against other state-of-the-art methods across a variety of evaluation measures.
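
The abstract describes the architecture only at a high level. As a rough illustration, the PyTorch-style sketch below shows how a residual block might stand in for U-Net's plain copy-and-crop skip connection and how an attention module might refine a decoder stage. This is a minimal sketch under stated assumptions: the class names, channel sizes, and the squeeze-and-excitation-style gate are illustrative choices, not the authors' implementation, and the adaptive denoising learning (ADL) strategy is not shown.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    # Two 3x3 convolutions with a shortcut; used in the encoder and, in this
    # sketch, in place of U-Net's plain copy-and-crop skip connections.
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        self.shortcut = nn.Identity() if in_ch == out_ch else nn.Conv2d(in_ch, out_ch, 1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.body(x) + self.shortcut(x))

class ChannelAttention(nn.Module):
    # Squeeze-and-excitation style channel gate standing in for the
    # convolutional attention module mentioned in the abstract (an assumption).
    def __init__(self, ch, reduction=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(ch // reduction, ch, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

class DecoderStage(nn.Module):
    # Upsample the coarse feature, fuse it with the residual-processed
    # encoder skip feature, then refine the result with channel attention.
    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_ch, out_ch, 2, stride=2)
        self.skip = ResidualBlock(skip_ch, out_ch)   # replaces copy-and-crop
        self.fuse = ResidualBlock(out_ch * 2, out_ch)
        self.attn = ChannelAttention(out_ch)

    def forward(self, x, skip):
        x = torch.cat([self.up(x), self.skip(skip)], dim=1)
        return self.attn(self.fuse(x))

if __name__ == "__main__":
    stage = DecoderStage(in_ch=128, skip_ch=64, out_ch=64)
    coarse = torch.randn(1, 128, 32, 32)   # feature from the deeper decoder level
    skip = torch.randn(1, 64, 64, 64)      # matching encoder feature
    print(stage(coarse, skip).shape)       # torch.Size([1, 64, 64, 64])

In this sketch, the shortcut path inside each residual block is what lets gradient information bypass the convolutional stack, which is the role the abstract attributes to the residual interconnections and the residual-block-based skip concatenation.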