Paper Detail

Paper ID: TEC-6.6
Paper Title: PixStabNet: FAST MULTI-SCALE DEEP ONLINE VIDEO STABILIZATION WITH PIXEL-BASED WARPING
Authors: Yu-Ta Chen, Kuan-Wei Tseng, National Taiwan University, Taiwan; Yao-Chih Lee, Academia Sinica, Taiwan; Chun-Yu Chen, Yi-Ping Hung, National Taiwan University, Taiwan
Session: TEC-6: Image and Video Processing 2
Location: Area G
Session Time: Monday, 20 September, 15:30 - 17:00
Presentation Time: Monday, 20 September, 15:30 - 17:00
Presentation: Poster
Topic: Image and Video Processing: Restoration and enhancement
IEEE Xplore Open Preview: available in IEEE Xplore
Abstract: Online video stabilization is increasingly needed for real-time applications such as live streaming, drone remote control, and video communication. We propose a multi-scale convolutional neural network (PixStabNet) that stabilizes video in real time without using future frames. Instead of calculating a global homography or multiple homographies, we estimate a pixel-based warping map that transforms each pixel individually, giving more precise modelling. In addition, we propose well-designed loss functions along with a two-stage training scheme to enhance network robustness. Quantitative results show that our method outperforms other learning-based online methods in terms of stability, with excellent geometric and temporal consistency. Moreover, to the best of our knowledge, the proposed algorithm is the most efficient approach for video stabilization. The models and results are available at: https://yu-ta-chen.github.io/PixStabNet.
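
To illustrate the idea of pixel-based warping (as opposed to a single global homography), the sketch below shows one common way to apply a dense per-pixel displacement map to a frame in PyTorch using grid_sample. This is not the authors' implementation; the function name, tensor shapes, and the identity warp map in the example are assumptions for illustration only.

import torch
import torch.nn.functional as F

def warp_with_pixel_map(frame, warp_map):
    """Warp a frame using a dense per-pixel displacement map.

    frame:    (N, C, H, W) input frames
    warp_map: (N, 2, H, W) per-pixel (dx, dy) offsets in pixels (illustrative assumption)
    """
    n, _, h, w = frame.shape
    # Build a base sampling grid of absolute pixel coordinates.
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=frame.dtype, device=frame.device),
        torch.arange(w, dtype=frame.dtype, device=frame.device),
        indexing="ij",
    )
    base = torch.stack((xs, ys), dim=0).unsqueeze(0)   # (1, 2, H, W)
    coords = base + warp_map                           # shifted sampling coordinates
    # Normalize coordinates to [-1, 1] as required by grid_sample.
    coords_x = 2.0 * coords[:, 0] / (w - 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / (h - 1) - 1.0
    grid = torch.stack((coords_x, coords_y), dim=-1)   # (N, H, W, 2)
    return F.grid_sample(frame, grid, align_corners=True)

# Example: apply an identity (zero-offset) warp map to a random 256x256 RGB frame.
frame = torch.rand(1, 3, 256, 256)
warp_map = torch.zeros(1, 2, 256, 256)
stabilized = warp_with_pixel_map(frame, warp_map)

In a stabilization network of this kind, the warp map would be predicted per frame by the model; because every pixel receives its own offset, the transformation can capture local motion that a single homography cannot.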