Structure-aware human pose estimation with graph convolutional networks
Authors:
Abstract
Human pose estimation is the task of localizing body key points in still images. Because body key points are inter-connected, it is desirable to model the structural relationships between them to further improve localization performance. In this paper, building on standard graph convolutional networks, we propose a novel model, termed Pose Graph Convolutional Network (PGCN), to exploit these important relationships for pose estimation. Specifically, our model builds a directed graph over body key points according to the natural compositional model of the human body. Each node (key point) is represented by a 3-D tensor consisting of multiple feature maps, initially generated by our backbone network, so that accurate spatial information is retained. Furthermore, an attention mechanism is introduced to focus on crucial edges (structural information) between key points. PGCN then learns to map the graph into a set of structure-aware key point representations that encode both the structure of the human body and the appearance information of individual key points. Additionally, we propose two modules for PGCN: the Local PGCN (L-PGCN) module and the Non-Local PGCN (NL-PGCN) module. The former uses spatial attention to capture correlations between the local areas of adjacent key points and thereby refine key point locations, while the latter captures long-range relationships via a non-local operation to associate challenging key points. Equipped with these two modules, PGCN further improves localization performance. Experiments on both single- and multi-person pose estimation benchmark datasets show that our method consistently outperforms competing state-of-the-art methods.
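To make the core idea concrete, the sketch below shows one possible graph-convolution layer over key-point nodes, where each node carries a feature map and incoming edges are weighted by a learned attention score masked to the skeleton structure. This is only a minimal illustration under assumptions, not the authors' implementation: the class name PoseGraphConv, the adjacency convention, and the toy four-joint skeleton are all hypothetical, and the paper's actual PGCN, L-PGCN, and NL-PGCN modules are more elaborate.

```python
# Minimal, illustrative graph convolution over key-point feature maps with
# edge attention. All names here (PoseGraphConv, adjacency, etc.) are
# hypothetical; this is not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PoseGraphConv(nn.Module):
    """Propagate information between key-point nodes along a directed
    skeleton graph; each node is a feature map of shape (C, H, W)."""

    def __init__(self, channels, adjacency):
        super().__init__()
        # adjacency: (K, K) binary matrix, adjacency[i, j] = 1 if edge j -> i exists
        self.register_buffer("adjacency", adjacency.float())
        # 1x1 conv applied to messages sent by neighbouring key points
        self.message = nn.Conv2d(channels, channels, kernel_size=1)
        # learnable scalar logit per directed edge (crucial-edge weighting)
        self.edge_logits = nn.Parameter(torch.zeros_like(adjacency, dtype=torch.float))
        # fuses a node's own appearance features with the aggregated structural message
        self.update = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, x):
        # x: (B, K, C, H, W) -- one backbone feature map per key point
        B, K, C, H, W = x.shape
        msg = self.message(x.reshape(B * K, C, H, W)).reshape(B, K, C, H, W)

        # Attention over incoming edges, masked to the skeleton structure.
        attn = self.edge_logits.masked_fill(self.adjacency == 0, float("-inf"))
        attn = F.softmax(attn, dim=1)      # rows sum to 1 over a node's parents
        attn = torch.nan_to_num(attn)      # nodes without parents get zero weight

        # Aggregate parent messages: agg[i] = sum_j attn[i, j] * msg[j]
        agg = torch.einsum("ij,bjchw->bichw", attn, msg)

        # Combine appearance and structural information into the node update.
        out = self.update(torch.cat([x, agg], dim=2).reshape(B * K, 2 * C, H, W))
        return out.reshape(B, K, C, H, W)


if __name__ == "__main__":
    # Toy skeleton with 4 key points and directed edges 0 -> 1, 1 -> 2, 1 -> 3.
    A = torch.zeros(4, 4)
    A[1, 0] = A[2, 1] = A[3, 1] = 1.0
    layer = PoseGraphConv(channels=32, adjacency=A)
    feats = torch.randn(2, 4, 32, 16, 16)   # per-key-point feature maps from a backbone
    print(layer(feats).shape)               # torch.Size([2, 4, 32, 16, 16])
```

Keeping each node as a full (C, H, W) feature map, rather than a pooled vector, is what lets such a layer preserve the spatial information that the abstract emphasizes.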
Keywords: Human pose estimation, Graph convolutional networks, Key point structural relations
Article history: Received 28 March 2019; Revised 19 December 2019; Accepted 29 April 2020; Available online 16 May 2020; Version of Record 24 May 2020.
DOI: https://doi.org/10.1016/j.patcog.2020.107410