Transformer-based Dynamic Fusion Clustering Network
Abstract
Clustering is a challenging task in machine learning. Numerous studies have improved clustering performance by integrating deep learning into clustering techniques. However, current deep clustering research still has limitations: (1) it lacks a dynamic fusion mechanism that allows multiple deep networks to jointly train node information; (2) data structure embedding methods are not mature enough, and as deep networks grow deeper, their ability to learn data representations declines, resulting in low performance. In contrast to these clustering methods, we propose the Transformer-based Dynamic Fusion Clustering Network (TDCN), a novel deep clustering network built mainly on the Transformer architecture that addresses these issues and improves clustering performance. Specifically, a new dynamic attention mechanism fuses the features of the Transformer and autoencoder (AE) branches in TDCN. To capture structural information in the data, a new transformation operation G is designed; G varies with the characteristics of the source data, helping to represent the data structure. In addition, TDCN stacks multi-layer, multi-scale heterogeneous networks to learn node representations and further integrates information at different scales through dedicated modules, facilitating efficient information extraction. The whole deep clustering network is trained with a dual self-supervision mechanism. Experiments on five datasets show that our model achieves comparable or even better performance than state-of-the-art methods.
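The abstract describes a dynamic attention mechanism that fuses Transformer and AE features, but gives no formula. A common way to realize such fusion is a per-node attention weight over the two branch embeddings; the sketch below illustrates that generic pattern in NumPy. The function and parameter names (`dynamic_fusion`, `w_tf`, `w_ae`) are hypothetical stand-ins, not the paper's actual formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_fusion(z_tf, z_ae, w_tf, w_ae):
    """Attention-weighted fusion of two (n, d) branch embeddings.

    z_tf, z_ae : embeddings from the Transformer and AE branches.
    w_tf, w_ae : (d,) scoring vectors standing in for learned attention
                 parameters (hypothetical; TDCN's exact parameterization
                 is not given in the abstract).
    """
    # One scalar score per node and branch, normalized across the two branches.
    scores = np.stack([z_tf @ w_tf, z_ae @ w_ae], axis=1)  # (n, 2)
    alpha = softmax(scores, axis=1)                        # rows sum to 1
    # Per-node convex combination of the two branch embeddings.
    return alpha[:, :1] * z_tf + alpha[:, 1:] * z_ae

rng = np.random.default_rng(0)
n, d = 6, 4
z_tf = rng.normal(size=(n, d))
z_ae = rng.normal(size=(n, d))
fused = dynamic_fusion(z_tf, z_ae, rng.normal(size=d), rng.normal(size=d))
print(fused.shape)  # (6, 4)
```

Because the weights are a softmax, each fused row is a convex combination of the corresponding Transformer and AE rows, so the fusion interpolates between the two branches on a per-node basis.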
Keywords: Deep clustering, Dynamic attention mechanism, Transformer network, Self-supervised learning, Feature fusion
Article history: Received 6 April 2022; Revised 15 September 2022; Accepted 4 October 2022; Available online 10 October 2022; Version of Record 22 October 2022.
DOI: https://doi.org/10.1016/j.knosys.2022.109984