Perceptually-guided deep neural networks for ego-action prediction: Object grasping

Authors:

Highlights:

• Biologically inspired models for grasping action prediction.

• Gaze-driven detection model for objects to be grasped in egocentric video.

• Two alternative methods for noise handling in eye-gaze measurements.

• A novel loss for automatic prediction of grasping actions.

• A new public dataset for prediction of grasping actions in egocentric video.
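The highlights mention handling noise in eye-gaze measurements. As illustration only (the paper's two actual methods are not described in these highlights), a common generic approach is a moving-median filter over the tracked gaze coordinates, which suppresses jitter and blink outliers; the `smooth_gaze` helper below is a hypothetical sketch, not the authors' method.

```python
import numpy as np

def smooth_gaze(points, window=5):
    """Moving-median filter over a sequence of 2-D gaze points.

    A median filter is one generic way to suppress jitter and blink
    outliers in eye-tracker output; it is NOT claimed to be one of the
    paper's two noise-handling methods.
    """
    points = np.asarray(points, dtype=float)  # shape (N, 2): x, y per frame
    half = window // 2
    smoothed = np.empty_like(points)
    for i in range(len(points)):
        # Clamp the window at the sequence boundaries.
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        smoothed[i] = np.median(points[lo:hi], axis=0)
    return smoothed
```

A single spiking sample (e.g. a blink artifact) inside a window of stable fixations is replaced by the window median, while genuine sustained gaze shifts pass through.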

Abstract:


Keywords: Human perception, Grasping action prediction, Weakly supervised active object detection

Article history: Received 31 March 2018, Revised 16 October 2018, Accepted 17 November 2018, Available online 17 November 2018, Version of Record 24 November 2018.

DOI: https://doi.org/10.1016/j.patcog.2018.11.013