Dimensionality reduction for heterogeneous dataset in rushes editing
Authors:
Abstract
Rushes editing enables a computer to edit a film from raw footage in the manner of a professional film cutter. The central issue in rushes editing is the generation of effective, efficient, and robust descriptors for footage content analysis. Dimensionality reduction provides a means of generating such descriptors by seeking a low-dimensional equivalent of the high-dimensional video data. However, existing dimensionality reduction techniques are not directly applicable to the editing of rushes because of the heterogeneity of rushes data. To deal with this heterogeneity, this paper proposes a novel non-linear dimensionality reduction algorithm called multi-layer isometric feature mapping (ML-Isomap). First, a clustering algorithm partitions the high-dimensional data points into a set of data blocks in the high-dimensional feature space. Second, intra-cluster graphs are constructed from the individual character of each data block to form the basic layer of ML-Isomap. Third, an inter-cluster graph is constructed by analyzing the interrelations among these isolated data blocks to form the hyper-layers of ML-Isomap. Finally, all data points are mapped into a single low-dimensional feature space while preserving, to the greatest extent possible, the corresponding relations of the multiple layers in the high-dimensional feature space. Comparative experiments on synthetic data as well as real rushes editing tasks demonstrate that the proposed algorithm reduces the dimensionality of various datasets efficiently while preserving both the global structure and the local details of heterogeneous data.
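The four steps above map naturally onto a graph-based embedding pipeline. The following Python sketch is not the authors' implementation; it assumes k-means for the clustering step, k-nearest-neighbour graphs for the intra-cluster layer, a single shortest link between every pair of clusters for the inter-cluster layer, and metric MDS on graph geodesic distances for the final embedding (as in standard Isomap). The function name ml_isomap and all parameter defaults are illustrative.

import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import shortest_path
from sklearn.cluster import KMeans
from sklearn.manifold import MDS
from sklearn.neighbors import kneighbors_graph


def ml_isomap(X, n_clusters=4, n_neighbors=8, n_components=2):
    """Embed X (n_samples x n_features) into n_components dimensions.

    Step 1: partition the points into clusters (k-means is an assumption here).
    Step 2: build an intra-cluster k-NN graph per cluster (the basic layer).
    Step 3: link every pair of clusters via their closest points (hyper-layer).
    Step 4: embed graph geodesic distances with metric MDS, as classical Isomap does.
    """
    n = X.shape[0]
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)

    # Step 2: intra-cluster neighbourhood graphs collected in one sparse matrix.
    W = lil_matrix((n, n))
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        k = min(n_neighbors, len(idx) - 1)
        if k < 1:
            continue
        G = kneighbors_graph(X[idx], n_neighbors=k, mode="distance")
        for i, j in zip(*G.nonzero()):
            W[idx[i], idx[j]] = G[i, j]
            W[idx[j], idx[i]] = G[i, j]

    # Step 3: one inter-cluster edge between the closest pair of points of
    # every two clusters, so the combined multi-layer graph becomes connected.
    for a in range(n_clusters):
        for b in range(a + 1, n_clusters):
            ia = np.where(labels == a)[0]
            ib = np.where(labels == b)[0]
            if not len(ia) or not len(ib):
                continue
            D = np.linalg.norm(X[ia, None, :] - X[None, ib, :], axis=2)
            i, j = np.unravel_index(D.argmin(), D.shape)
            W[ia[i], ib[j]] = D[i, j]
            W[ib[j], ia[i]] = D[i, j]

    # Step 4: geodesic distances on the multi-layer graph, then metric MDS.
    geo = shortest_path(W.tocsr(), method="D", directed=False)
    finite = np.isfinite(geo)
    geo[~finite] = geo[finite].max()  # guard against stray disconnected points
    mds = MDS(n_components=n_components, dissimilarity="precomputed", random_state=0)
    return mds.fit_transform(geo)

For example, calling ml_isomap(np.asarray(features), n_clusters=4) on an n x d feature matrix returns an n x 2 embedding in which intra-cluster neighbourhoods and inter-cluster relations are both reflected in the geodesic distances.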
Keywords: Dimensionality reduction, Rushes editing, Manifold learning, Isometric feature mapping, Multi-layer isometric feature mapping
Article history: Received 2 January 2008, Revised 31 May 2008, Accepted 16 June 2008, Available online 24 June 2008.
DOI: https://doi.org/10.1016/j.patcog.2008.06.016