A Brain-Inspired Method of Facial Expression Generation Using Chaotic Feature Extracting Bidirectional Associative Memory

Authors: Isar Nejadgholi, Seyyed Ali SeyyedSalehi, Sylvain Chartier

Abstract

The human cognitive system adapts to many different environments by exhibiting a broad range of behaviors according to context. These behaviors range from general abstractions, referred to as prototypes, to specific perceptual patterns, referred to as exemplars. A chaotic feature-extracting associative memory is proposed to mimic the human brain in generating prototype and exemplar facial expressions. The model automatically extracts features from each category of images associated with a specific subject and expression. In the training phase, features are extracted as fixed points of the network. In the recall phase, the output attractor of the network ranges from a fixed point, which yields a prototype facial image, to chaotic attractors, which generate exemplar faces. The generative model is applied to enrich a facial image dataset in terms of variability by generating various virtual patterns when only one image per subject is available. A face recognition task is implemented to compare the enriched and original datasets for training classifiers. Our results show that recognition accuracy increases from 32% to 100% when exemplars generated by the proposed model are used to enrich the training dataset.
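To make the recall-regime idea concrete, the following is a minimal, hypothetical Python sketch of a bidirectional associative memory whose recall dynamics are tuned between a fixed-point (prototype) regime and a chaotic (exemplar) regime by a single transmission parameter. The cubic transmission function, the one-shot Hebbian outer-product learning, and the specific delta values are illustrative assumptions in the spirit of Chartier-style BAMs, not the paper's actual feature-extracting model or parameter settings.

```python
# Hypothetical sketch: BAM recall tuned between fixed-point (prototype)
# and chaotic (exemplar) regimes. Transmission function, learning rule,
# and delta values are illustrative assumptions, not the paper's model.
import numpy as np

def transmit(a, delta):
    """Cubic transmission f(a) = (1 + delta) * a - delta * a**3 (assumed form).
    In this toy setup the stored bipolar pattern is a stable fixed point of the
    recall loop for delta < 1; for delta somewhat above 1 it destabilises and
    recall keeps producing bounded, varying states."""
    return (1.0 + delta) * a - delta * a ** 3

def train_bam(x0, y0):
    """One-shot Hebbian outer-product learning of the bidirectional weights
    (a stand-in for the paper's feature-extracting learning rule)."""
    W = np.outer(y0, x0) / x0.size   # input layer  -> feature layer
    V = np.outer(x0, y0) / y0.size   # feature layer -> input layer
    return W, V

def recall(W, V, x_init, delta, steps=200):
    """Iterate the two layers and return the trajectory of the input layer."""
    x = x_init.copy()
    traj = []
    for _ in range(steps):
        y = transmit(W @ x, delta)
        x = transmit(V @ y, delta)
        traj.append(x.copy())
    return np.array(traj)

rng = np.random.default_rng(0)
x0 = rng.choice([-1.0, 1.0], size=64)   # toy bipolar "image" pattern
y0 = rng.choice([-1.0, 1.0], size=16)   # toy feature code
W, V = train_bam(x0, y0)

noisy = x0 + 0.1 * rng.standard_normal(x0.size)
prototype = recall(W, V, noisy, delta=0.3)   # settles onto the stored pattern
exemplars = recall(W, V, noisy, delta=1.4)   # bounded, non-settling variations

print("prototype error:", np.abs(prototype[-1] - x0).max())
print("exemplar spread:", exemplars[-20:].std(axis=0).mean())
```

In this toy setting, a delta below 1 lets a noisy probe relax back onto the stored pattern (the prototype), whereas a delta around 1.4 keeps the reconstruction wandering within a bounded region around it, which is the kind of variability the paper exploits to generate virtual exemplar faces.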

Keywords: Neural networks, Chaos theory, Facial expression, Feature extraction, Virtual pattern generation


Paper link: https://doi.org/10.1007/s11063-017-9615-5