
ICIP 2021


Paper Detail

Paper ID: MLR-APPL-IVSMR-3.12
Paper Title: EDGE-LABELING BASED DIRECTED GATED GRAPH NETWORK FOR FEW-SHOT LEARNING
Authors: Peixiao Zheng, Xin Guo, Lin Qi, Zhengzhou University, China
Session: MLR-APPL-IVSMR-3: Machine learning for image and video sensing, modeling and representation 3
Location: Area D
Session Time: Wednesday, 22 September, 14:30 - 16:00
Presentation Time: Wednesday, 22 September, 14:30 - 16:00
Presentation: Poster
Topic: Applications of Machine Learning: Machine learning for image & video sensing, modeling, and representation
IEEE Xplore: open preview available
Abstract: Existing graph-network-based few-shot learning methods obtain the similarity between nodes through a convolutional neural network (CNN). However, CNNs are designed for image data with spatial structure rather than for vector-form node features. In this paper, we propose an edge-labeling-based directed gated graph network (DGGN) for few-shot learning, which utilizes gated recurrent units to implicitly update the similarity between nodes. DGGN is composed of a gated node aggregation module and an improved gated recurrent unit (GRU) based edge update module. Specifically, the node update module adopts a gate mechanism driven by the activation of edge features, yielding a learnable node aggregation process. In addition, improved GRU cells are employed in the edge update procedure to compute the similarity between nodes. This mechanism also benefits gradient backpropagation through the GRU sequence across layers. Experimental results on two benchmark datasets show that DGGN achieves performance comparable to state-of-the-art methods.
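The two modules described in the abstract — gated node aggregation driven by edge activations, and a GRU-based edge update that refines node similarity across layers — can be sketched in NumPy. This is a minimal illustration of the general idea, not the authors' implementation: the specific gate form (a scalar sigmoid of the mean edge feature), the pairwise GRU input, and all names (`dggn_layer`, `gru_cell`) are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(h, x, Wz, Uz, Wr, Ur, Wh, Uh):
    """Standard GRU cell: h is the previous hidden state, x the input."""
    z = sigmoid(x @ Wz + h @ Uz)              # update gate
    r = sigmoid(x @ Wr + h @ Ur)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate state
    return (1 - z) * h + z * h_tilde

def dggn_layer(nodes, edges, W, gru_params):
    """One illustrative DGGN-style layer (hypothetical form).

    nodes: (N, d) node features; edges: (N, N, d_e) edge features.
    Node update: each node aggregates its neighbours, gated by a scalar
    activation of the corresponding edge feature.
    Edge update: a GRU cell treats the current edge feature as the hidden
    state and the concatenated endpoint features as the input, so the
    similarity is refined recurrently from layer to layer.
    """
    N, d = nodes.shape
    # --- gated node aggregation ---
    gate = sigmoid(edges.mean(axis=-1))                 # (N, N) scalar gates
    gate = gate / (gate.sum(axis=1, keepdims=True) + 1e-8)
    new_nodes = np.tanh(gate @ nodes @ W)               # learnable aggregation
    # --- GRU-based edge update ---
    pair = np.concatenate(
        [np.repeat(new_nodes[:, None, :], N, axis=1),   # sender features
         np.repeat(new_nodes[None, :, :], N, axis=0)],  # receiver features
        axis=-1)                                        # (N, N, 2d)
    new_edges = gru_cell(edges.reshape(N * N, -1),
                         pair.reshape(N * N, -1), *gru_params)
    return new_nodes, new_edges.reshape(N, N, -1)
```

Because the edge state is carried through a GRU cell rather than recomputed from scratch, gradients can flow through the gate's additive update path across stacked layers, which matches the backpropagation benefit the abstract claims for the GRU sequence.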