Class-prior estimation for learning from positive and unlabeled data
Authors: Marthinus C. du Plessis, Gang Niu, Masashi Sugiyama
Abstract
We consider the problem of estimating the class prior in an unlabeled dataset. Under the assumption that an additional labeled dataset is available, the class prior can be estimated by fitting a mixture of class-wise data distributions to the unlabeled data distribution. However, in practice, such an additional labeled dataset is often not available. In this paper, we show that, with additional samples coming only from the positive class, the class prior of the unlabeled dataset can be estimated correctly. Our key idea is to use properly penalized divergences for model fitting to cancel the error caused by the absence of negative samples. We further show that the use of the penalized \(L_1\)-distance gives a computationally efficient algorithm with an analytic solution. The consistency, stability, and estimation error are theoretically analyzed. Finally, we experimentally demonstrate the usefulness of the proposed method.
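The following is a minimal numerical sketch of the baseline scenario described in the first sentence of the abstract: when labeled samples from both classes are available, the class prior can be estimated by fitting the mixture π p₊(x) + (1 − π) p₋(x) to the unlabeled data distribution. It is not the paper's proposed estimator (which uses a penalized L1-distance and requires only positive and unlabeled samples); the synthetic Gaussian data, kernel density estimates, and grid search below are illustrative assumptions.

```python
# Sketch: estimate the class prior by fitting a mixture of class-wise densities
# to the unlabeled density (the fully labeled baseline, not the paper's PU method).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
true_prior = 0.3  # fraction of positives in the unlabeled set (assumed for the demo)

# Synthetic 1-D data: positives ~ N(0, 1), negatives ~ N(3, 1).
pos_labeled = rng.normal(0.0, 1.0, 500)
neg_labeled = rng.normal(3.0, 1.0, 500)
n_unlabeled = 2000
n_pos_u = rng.binomial(n_unlabeled, true_prior)
unlabeled = np.concatenate([
    rng.normal(0.0, 1.0, n_pos_u),
    rng.normal(3.0, 1.0, n_unlabeled - n_pos_u),
])

# Kernel density estimates of the class-conditional and unlabeled densities.
p_pos = gaussian_kde(pos_labeled)
p_neg = gaussian_kde(neg_labeled)
p_unl = gaussian_kde(unlabeled)

# Fit the mixture weight by minimizing the L1 distance on a grid.
grid = np.linspace(-5.0, 8.0, 2000)
dx = grid[1] - grid[0]
priors = np.linspace(0.0, 1.0, 101)
l1 = [np.sum(np.abs(pi * p_pos(grid) + (1 - pi) * p_neg(grid) - p_unl(grid))) * dx
      for pi in priors]
print("estimated class prior:", priors[int(np.argmin(l1))])  # close to 0.3
```

In the positive-and-unlabeled setting addressed by the paper, p₋(x) is unavailable, and the paper's contribution is to replace this fully supervised fit with a penalized divergence so that the resulting estimate of the class prior remains correct without negative samples.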
Keywords: Class-prior estimation, Positive and unlabeled learning
Paper URL: https://doi.org/10.1007/s10994-016-5604-6