Trajectory aligned features for first person action recognition
Authors:
Highlights:
• We propose a novel and simple representation of first person actions.
• The features are based on feature trajectories and are simple to compute.
• Our approach does not rely on hand or object segmentation or pose estimation.
• Our technique yields an improvement of more than 11% on publicly available datasets.
• Our method can recognize the wearer's actions even when hands and objects are not visible.
Keywords: Action and activity recognition, Egocentric vision, Video indexing and analysis, Video segmentation
Article history: Received 21 September 2015, Revised 17 April 2016, Accepted 23 July 2016, Available online 26 August 2016, Version of Record 7 September 2016.
DOI: https://doi.org/10.1016/j.patcog.2016.07.031