Non-iterative Knowledge Fusion in Deep Convolutional Neural Networks

Authors: Mikhail Iu. Leontev, Viktoriia Islenteva, Sergey V. Sukhov

Abstract

Incorporation of new knowledge into neural networks with simultaneous preservation of the previous knowledge is known to be a nontrivial problem. This problem becomes even more complex when the new knowledge is contained not in new training examples, but inside the parameters (e.g., connection weights) of another neural network. In this correspondence, we propose and test two methods of combining knowledge contained in separate networks. The first method is based on a summation of weights. The second incorporates new knowledge by modifying weights that are nonessential for the preservation of previously stored information. We show that with these methods, knowledge can be transferred non-iteratively from one network to another without requiring additional training sessions. The fused network operates efficiently, performing classification at a level similar to that of an ensemble of networks. The efficiency of the methods is quantified on several publicly available data sets in classification tasks for both shallow and deep feedforward neural networks.
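The two fusion strategies described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is an assumption-based illustration in PyTorch for two networks with identical architectures, where `fuse_by_summation`, `fuse_by_importance`, the mixing coefficient `alpha`, the per-weight `importance_a` scores, and the no-argument constructor are all hypothetical choices introduced here for clarity.

```python
# Hedged sketch: non-iterative fusion of two identically shaped networks.
# Function names, the mixing coefficient, and the importance scores are
# illustrative assumptions, not the method as published.
import torch


def fuse_by_summation(model_a, model_b, alpha=0.5):
    """Fuse two networks by a weighted sum of their parameters."""
    state_a, state_b = model_a.state_dict(), model_b.state_dict()
    fused = {}
    for name, w_a in state_a.items():
        if not torch.is_floating_point(w_a):
            fused[name] = w_a  # keep integer buffers (e.g., BatchNorm counters) as-is
            continue
        fused[name] = alpha * w_a + (1.0 - alpha) * state_b[name]
    model_c = type(model_a)()  # assumes a no-argument constructor
    model_c.load_state_dict(fused)
    return model_c


def fuse_by_importance(model_a, model_b, importance_a, threshold=1e-3):
    """Overwrite only those weights of model_a that are unimportant for its task.

    `importance_a` maps parameter names to per-weight importance scores,
    e.g., squared gradients accumulated on model_a's training data.
    """
    state_a, state_b = model_a.state_dict(), model_b.state_dict()
    fused = {}
    for name, w_a in state_a.items():
        if not torch.is_floating_point(w_a) or name not in importance_a:
            fused[name] = w_a
            continue
        # mask = 1 where the weight is nonessential and may be replaced
        mask = (importance_a[name] < threshold).to(w_a.dtype)
        fused[name] = (1.0 - mask) * w_a + mask * state_b[name]
    model_c = type(model_a)()
    model_c.load_state_dict(fused)
    return model_c
```

In this sketch, the summation variant blends all floating-point parameters, while the importance-based variant preserves the weights that matter for the first network's task and injects the second network's weights only where they do not.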

Keywords: Knowledge fusion, Transfer learning, Convolutional neural networks, Non-iterative learning


Paper URL: https://doi.org/10.1007/s11063-019-10074-0