VR-PROUD: Vehicle Re-identification using PROgressive Unsupervised Deep architecture
Abstract
Vehicle re-identification (re-ID) is one of the primary components of an automated visual surveillance system. It aims to automatically identify and search for vehicles across a multi-camera network, usually with non-overlapping fields of view. The majority of approaches tackle the re-ID problem in a supervised manner, which has limitations that hinder generalization: a large amount of annotated data is required for training, and such models cannot easily accommodate the dynamic growth of the data. Unsupervised learning techniques can potentially cope with these issues by drawing inference directly from unlabeled input data, and they have been effectively employed in the context of person re-ID. To this end, this paper presents an approach that formulates the whole vehicle re-ID problem within an unsupervised learning paradigm using a progressive two-step cascaded framework. It combines a CNN architecture for feature extraction with an unsupervised technique that enables self-paced progressive learning. It also incorporates contextual information into the proposed progressive framework, which significantly improves the convergence of the learning algorithm. Moreover, the approach is generic and is the first attempt to tackle the vehicle re-ID problem in an unsupervised manner. The performance of the proposed algorithm has been thoroughly analyzed on two large, publicly available benchmark datasets for vehicle re-ID, VeRi and VehicleID, using image-to-image and cross-camera search strategies, and it achieves better performance than current state-of-the-art approaches under standard evaluation metrics.
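The progressive two-step cascade described in the abstract (extract CNN features, cluster them without labels, then keep only the most reliable samples per cluster as pseudo-labeled data for the next round of training) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the farthest-point initialisation, and the distance-quantile reliability rule are assumptions, and the CNN feature extractor is stood in for by precomputed feature vectors.

```python
import numpy as np

def cluster_features(features, k, iters=10):
    """Lloyd's k-means over (n, d) feature vectors.

    Initialisation is greedy farthest-point (an illustrative choice):
    the first centre is the first sample, each subsequent centre is
    the sample farthest from all centres chosen so far.
    """
    centers = [features[0]]
    for _ in range(k - 1):
        d = np.min(
            [np.linalg.norm(features - c, axis=1) for c in centers], axis=0
        )
        centers.append(features[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        dists = np.linalg.norm(features[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels, centers

def select_reliable(features, labels, centers, keep_frac):
    """Self-paced selection: keep the keep_frac fraction of samples
    closest to their assigned cluster centre; the rest are deferred
    to later, 'harder' rounds as keep_frac grows."""
    dist = np.linalg.norm(features - centers[labels], axis=1)
    return dist <= np.quantile(dist, keep_frac)
```

In a full progressive loop, the mask returned by `select_reliable` would define pseudo-labels for fine-tuning the CNN, after which features are re-extracted, re-clustered, and `keep_frac` is gradually increased so training self-paces from easy to hard samples.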
Keywords: Vehicle re-ID, Deep learning, Unsupervised, Clustering, Visual surveillance, Progressive learning, Self-paced
Article history: Received 1 June 2018, Revised 14 December 2018, Accepted 4 January 2019, Available online 15 January 2019, Version of Record 18 January 2019.
DOI: https://doi.org/10.1016/j.patcog.2019.01.008