ICIP 2021 Paper Detail

Paper ID: MLR-APPL-IP-8.7
Paper Title: SHUFFLECOUNT: TASK-SPECIFIC KNOWLEDGE DISTILLATION FOR CROWD COUNTING
Authors: Minyang Jiang, Jianzhe Lin, Z. Jane Wang, University of British Columbia, Canada
Session: MLR-APPL-IP-8: Machine learning for image processing 8
Location: Area E
Session Time: Wednesday, 22 September, 14:30 - 16:00
Presentation Time: Wednesday, 22 September, 14:30 - 16:00
Presentation: Poster
Topic: Applications of Machine Learning: Machine learning for image processing
Abstract: One promising way to improve the performance of a small deep network is knowledge distillation: the performance of smaller student models with fewer parameters and lower computational cost can be comparable to that of larger teacher models on specific computer vision tasks. Knowledge distillation is especially attractive for high-accuracy, real-time crowd counting in everyday settings, where computational resources can be limited and model efficiency is extremely important. In this paper, we propose a novel task-specific knowledge distillation framework for crowd counting, named ShuffleCount. Its main contributions are two-fold. First, unlike existing frameworks, our task-specific ShuffleCount learns effectively from the teacher network through hierarchical feature regulation and better avoids negative knowledge transfer from the teacher. Second, the proposed student network, an optimized ShuffleNet, shows promising performance. When tested on the benchmark ShanghaiTech Part A dataset, it achieves 15% higher accuracy while keeping computational cost low compared with the state-of-the-art MobileCount. Our code is available online at https://github.com/JiangMinyang/CC-KD.
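
The abstract describes the distillation objective only at a high level. As a rough illustration of what feature-based knowledge distillation for density-map regression can look like, a minimal PyTorch-style sketch follows. It assumes a frozen teacher and a student whose selected intermediate feature maps already match the teacher's shapes; the names (distillation_loss, lambda_feat) are illustrative, and the paper's actual hierarchical feature regulation in the CC-KD repository will differ.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_feats, teacher_feats,
                          student_density, gt_density, lambda_feat=0.5):
        """Ground-truth density supervision plus stage-wise feature matching.

        student_feats / teacher_feats: lists of intermediate feature maps
        (shallow to deep) with matching shapes; density maps are (N, 1, H, W).
        NOT the paper's exact ShuffleCount objective; a generic KD sketch.
        """
        # Task loss: pixel-wise MSE against the ground-truth density map,
        # the standard supervision signal in crowd counting.
        task_loss = F.mse_loss(student_density, gt_density)

        # Feature loss: pull each student stage toward the corresponding
        # teacher stage; detach the teacher so no gradient flows into it.
        feat_loss = sum(
            F.mse_loss(s, t.detach())
            for s, t in zip(student_feats, teacher_feats)
        )
        return task_loss + lambda_feat * feat_loss

    # Smoke test with toy shapes (batch 2, 64-channel features, 32x32 maps).
    s_feats = [torch.randn(2, 64, 32, 32, requires_grad=True)]
    t_feats = [torch.randn(2, 64, 32, 32)]
    pred = torch.randn(2, 1, 32, 32, requires_grad=True)
    gt = torch.rand(2, 1, 32, 32)
    loss = distillation_loss(s_feats, t_feats, pred, gt)
    loss.backward()

Weighting the feature term with a scalar such as lambda_feat is one common way to trade off imitation of the teacher against the ground-truth objective; the abstract's point about avoiding negative transfer suggests the paper regulates this more selectively than a plain sum over all stages.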