Unsupervised human activity analysis for intelligent mobile robots
Authors:
Abstract
The success of intelligent mobile robots operating and collaborating with humans in daily living environments depends on their ability to generalise and learn human movements, and to obtain a shared understanding of an observed scene. In this paper we aim to understand human activities performed in real-world environments through long-term observation by an autonomous mobile robot. For our purposes, a human activity is defined as a changing spatial configuration of a person's body interacting with key objects that provide some functionality within an environment. To alleviate the perceptual limitations of a mobile robot, restricted by its obscured and incomplete sensory modalities, potentially noisy visual observations are mapped into an abstract qualitative space in order to generalise patterns invariant to exact quantitative positions within the real world. A number of qualitative spatio-temporal representations are used to capture different aspects of the relations between the human subject and their environment. Analogously to information retrieval on text corpora, a generative probabilistic technique is used to recover latent, semantically meaningful concepts in the encoded observations in an unsupervised manner. The small number of discovered concepts is treated as a set of human activity classes, granting the robot a low-dimensional understanding of visually observed complex scenes. Finally, variational inference is used to facilitate incremental and continuous updating of such concepts, allowing the mobile robot to learn and update its models of human activity over time and achieve efficient life-long learning.
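The following is a minimal sketch, not the authors' implementation, of the pipeline the abstract describes: each observation window is encoded as a bag of qualitative spatio-temporal "codewords" (the text-corpus analogy), and online variational LDA recovers a small number of latent activity classes while supporting incremental updates. The vocabulary size, number of activities, and the synthetic count data are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
vocab_size = 50     # distinct qualitative relation codewords (assumed size)
n_activities = 4    # small number of latent activity classes to recover

# Online variational Bayes permits continuous, life-long style updating
# as new observations arrive from the robot.
lda = LatentDirichletAllocation(
    n_components=n_activities,
    learning_method="online",
    random_state=0,
)

for batch in range(10):
    # Stand-in for a batch of encoded observation windows: each row counts
    # how often each qualitative codeword occurred in one observed episode.
    counts = rng.poisson(lam=1.0, size=(20, vocab_size))
    lda.partial_fit(counts)   # incremental update of the activity concepts

# A new observation is summarised by its mixture over the learned classes,
# giving the robot a low-dimensional description of the scene.
new_obs = rng.poisson(lam=1.0, size=(1, vocab_size))
print(lda.transform(new_obs).round(2))
```

In practice the count vectors would come from the qualitative spatio-temporal encoding of tracked human poses and key objects rather than from synthetic data.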
Keywords: Human activity analysis, Mobile robotics, Qualitative spatio-temporal representation, Low-rank approximations, Probabilistic machine learning, Latent Dirichlet allocation
Article history: Received 10 April 2018, Revised 11 September 2018, Accepted 9 December 2018, Available online 7 January 2019, Version of Record 22 January 2019.
DOI: https://doi.org/10.1016/j.artint.2018.12.005