A sparse graph wavelet convolution neural network for video-based person re-identification

Authors:

Highlights:

• We propose an adaptive graph generation module that captures the semantic relationships between different patches across frames, addressing the short-term occlusion and pedestrian misalignment problems in video-based person Re-ID.

• The generated sparse weighted graph captures only the supplementary information between highly correlated patches, avoiding the redundancy introduced by dense pairwise feature mapping (see the first sketch after this list).

• We propose a sparse graph wavelet convolution neural network (SGWCNN) framework that models and propagates spatial-temporal relationships between patches across frames to generate more robust and discriminative features (see the second sketch after this list).

• Compared with traditional GCN-based methods, SGWCNN has a sparser set of parameters, a more efficient algorithm, and higher accuracy on large-scale video-based person Re-ID datasets.
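The following is a minimal sketch of the idea behind an adaptive sparse graph over patch features: patches from all frames are compared, and only the strongest connections per patch are kept, yielding a sparse weighted adjacency matrix. The function name `build_sparse_patch_graph`, the use of cosine similarity, and the top-k selection rule are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (assumptions: cosine similarity, top-k neighbours) of
# building a sparse weighted graph over patch features pooled across frames.
import torch


def build_sparse_patch_graph(patch_feats: torch.Tensor, k: int = 4) -> torch.Tensor:
    """patch_feats: (N, D) features of N patches taken from T frames."""
    # Cosine similarity between every pair of patches.
    normed = torch.nn.functional.normalize(patch_feats, dim=1)
    sim = normed @ normed.t()                 # (N, N), values in [-1, 1]
    sim.fill_diagonal_(0.0)                   # drop trivial self-similarity

    # Keep only the k strongest connections per patch -> sparse weighted graph.
    topk_vals, topk_idx = sim.topk(k, dim=1)
    adj = torch.zeros_like(sim)
    adj.scatter_(1, topk_idx, topk_vals.clamp(min=0.0))

    # Symmetrise so the graph is undirected.
    return torch.maximum(adj, adj.t())


# Example: 8 frames x 6 patches, 256-d features per patch (random stand-ins).
feats = torch.randn(8 * 6, 256)
A = build_sparse_patch_graph(feats, k=4)
print(A.shape, (A > 0).float().mean())        # shape and sparsity of the graph
```

Only highly correlated patches end up connected, so later message passing propagates supplementary information between related patches rather than over all pairs.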
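The second sketch illustrates a generic graph wavelet convolution layer built on heat-kernel spectral wavelets (in the spirit of graph wavelet neural networks), not the SGWCNN layer from the paper. The class name `GraphWaveletConv`, the scale parameter, the heat kernel e^{-s·λ}, and the diagonal spectral filter are assumptions for illustration.

```python
# A minimal sketch of a graph wavelet convolution layer, assuming heat-kernel
# spectral wavelets psi = U e^{-s*Lambda} U^T on the normalised Laplacian.
import torch
import torch.nn as nn


class GraphWaveletConv(nn.Module):
    def __init__(self, adj: torch.Tensor, in_dim: int, out_dim: int, scale: float = 1.0):
        super().__init__()
        n = adj.shape[0]
        # Symmetric normalised Laplacian L = I - D^{-1/2} A D^{-1/2}.
        deg = adj.sum(dim=1).clamp(min=1e-8)
        d_inv_sqrt = deg.pow(-0.5)
        lap = torch.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
        # Wavelet basis and its inverse from the Laplacian eigendecomposition.
        evals, evecs = torch.linalg.eigh(lap)
        self.register_buffer("psi", evecs @ torch.diag(torch.exp(-scale * evals)) @ evecs.t())
        self.register_buffer("psi_inv", evecs @ torch.diag(torch.exp(scale * evals)) @ evecs.t())
        self.filter = nn.Parameter(torch.ones(n))        # learnable diagonal spectral filter
        self.weight = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (N, in_dim)
        x = self.weight(x)                               # feature transform
        x = self.psi @ (self.filter[:, None] * (self.psi_inv @ x))
        return torch.relu(x)


# Example with any symmetric weighted adjacency (e.g. A from the previous sketch).
N, D = 48, 256
feats = torch.randn(N, D)
A = torch.rand(N, N)
A = (A + A.t()) / 2
layer = GraphWaveletConv(A, in_dim=D, out_dim=128, scale=1.0)
out = layer(feats)                                       # (48, 128)
```

With a sparse adjacency, the wavelet bases are also localized, which is what keeps the parameterization and computation lighter than dense GCN-style aggregation over all patch pairs.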


Keywords: Video-based person re-identification, Weighted sparse graph, Graph wavelet convolution neural network

Article history: Received 10 July 2021, Revised 15 February 2022, Accepted 8 April 2022, Available online 13 April 2022, Version of Record 20 April 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.108708