Structural knowledge transfer for learning Sum-Product Networks
Authors:
Highlights:
Abstract
Learning an effective Sum-Product Network (SPN) for probabilistic inference requires a substantial amount of data; when the training dataset is small, the SPN's performance can degrade. In this paper, we investigate how transfer learning can improve an SPN when the number of training examples is limited. In particular, we consider a structural transfer setting in which (i) one has access not to a source dataset but to a source SPN, and (ii) the source SPN bears some similarity to the target domain. We propose a transfer learning approach, called TopTrSPN, that uses the first-layer cluster information of the source SPN to learn the first layer of the target SPN. Our approach is motivated by the transfer learning characteristics of Convolutional Neural Networks (CNNs), since an SPN can be viewed as a probabilistic, general-purpose convolutional network. Moreover, because the distribution encoded by the source SPN may differ from that of the target domain, we match the two by filtering out inconsistent variables. Empirical results on twenty benchmark datasets show the feasibility of the proposed approach for learning the SPN structure of a target domain. The approach also shows encouraging performance when applied to text datasets with different sets of variables.
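A minimal sketch of the idea described in the abstract, under stated assumptions (this is not the authors' TopTrSPN implementation): the source SPN's first layer is taken to be a root sum node whose children define clusters, each summarized here by per-variable Bernoulli means; the helper names filter_inconsistent_variables and seed_first_layer, and the nearest-prototype assignment rule, are illustrative only.

# Hypothetical sketch: transfer first-layer cluster information from a source
# SPN to seed the first layer of a target SPN, after filtering out variables
# that are inconsistent with the target domain.
import numpy as np

def filter_inconsistent_variables(cluster_means, shared_vars):
    # Drop source variables that do not appear in the target domain.
    return {c: {v: m for v, m in means.items() if v in shared_vars}
            for c, means in cluster_means.items()}

def seed_first_layer(target_data, var_index, cluster_means):
    # Assign each target instance to the closest transferred cluster prototype,
    # giving an initial instance partition for the target SPN's root sum node.
    labels = []
    for row in target_data:
        dists = {c: sum((row[var_index[v]] - m) ** 2 for v, m in means.items())
                 for c, means in cluster_means.items()}
        labels.append(min(dists, key=dists.get))
    return np.array(labels)

# Toy usage with binary data over named variables.
rng = np.random.default_rng(0)
target_vars = ["x1", "x2", "x3", "x4"]
var_index = {v: i for i, v in enumerate(target_vars)}
target_data = rng.integers(0, 2, size=(50, len(target_vars)))

source_clusters = {                          # per-variable means from the source SPN
    0: {"x1": 0.9, "x2": 0.8, "x5": 0.7},    # "x5" is inconsistent with the target
    1: {"x1": 0.1, "x3": 0.2, "x4": 0.3},
}
clusters = filter_inconsistent_variables(source_clusters, set(target_vars))
labels = seed_first_layer(target_data, var_index, clusters)
print(np.bincount(labels))

Under these assumptions, each resulting partition would seed one child of the target root sum node, with a standard structure learner (e.g., LearnSPN-style recursion) building the layers below.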
Keywords: Sum-Product Networks, Transfer learning
Article history: Received 30 August 2016, Revised 31 January 2017, Accepted 3 February 2017, Available online 4 February 2017, Version of Record 27 February 2017.
DOI: https://doi.org/10.1016/j.knosys.2017.02.005