Simplified multilayer graph convolutional networks with dropout
Authors: Fei Yang, Huyin Zhang, Shiming Tao
Abstract
Graph convolutional networks (GCNs) and their variants are excellent deep learning methods for graph-structured data. Moreover, multilayer GCNs can perform feature smoothing repeatedly, which yields considerable performance improvements. However, they may inherit unnecessary complexity and redundant computation; to make matters worse, they introduce overfitting as the number of layers increases. In this paper, we present simplified multilayer graph convolutional networks with dropout (DGCs), novel neural network architectures that successively perform nonlinearity removal and weight matrix merging between graph convolutional layers, leveraging a dropout layer to achieve feature augmentation and effectively reduce overfitting. Under such circumstances, first, we extend a shallow GCN to a multilayer GCN. Then, we reduce the complexity and redundant calculations of the multilayer GCN, while improving its classification performance. Finally, we make DGCs readily applicable to inductive and transductive tasks. Extensive experiments on citation networks and social networks offer evidence that the proposed model matches or outperforms state-of-the-art methods.
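The core simplification the abstract describes (removing nonlinearities between layers and merging the per-layer weight matrices into one, with dropout applied to the input features) can be sketched as follows. This is a minimal NumPy illustration of the general idea, not the authors' implementation; the function names and the placement of dropout are assumptions for illustration.

```python
import numpy as np

def normalize_adjacency(A):
    # Symmetric normalization with self-loops: S = D^{-1/2} (A + I) D^{-1/2},
    # the standard GCN propagation matrix.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def dgc_forward(A, X, W, k=2, drop_p=0.5, training=True, rng=None):
    # With nonlinearities removed, a k-layer GCN collapses to
    # softmax(S^k · dropout(X) · W), where the k weight matrices are
    # merged into the single matrix W. Dropout on the features acts as
    # the feature augmentation / regularization described in the abstract.
    S = normalize_adjacency(A)
    H = X
    if training:
        rng = rng or np.random.default_rng(0)
        mask = rng.random(X.shape) >= drop_p
        H = H * mask / (1.0 - drop_p)   # inverted dropout
    for _ in range(k):
        H = S @ H                        # repeated feature smoothing
    logits = H @ W
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)
```

Because `S @ H` can be precomputed once per value of `k` when the graph is fixed, training reduces to fitting a single linear classifier on the smoothed features, which is where the complexity reduction over a full multilayer GCN comes from.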
Keywords: Graph convolutional networks, Multilayer, Dropout, Feature augmentation
Paper URL: https://doi.org/10.1007/s10489-021-02617-7