Predictive complexity and information
Authors:
Abstract
The notions of predictive complexity and of the corresponding amount of information are considered. Predictive complexity is a generalization of Kolmogorov complexity which bounds the ability of any algorithm to predict elements of a sequence of outcomes. We consider predictive complexity for a wide class of bounded loss functions which generalize the square-loss function. Relations between the unconditional predictive complexity KG(x) and the conditional predictive complexity KG(x|y) are studied. We define an algorithm with an “expanding property”: with positive probability it transforms sequences of a given predictive complexity into sequences of essentially greater predictive complexity. The concept of the amount of predictive information IG(y:x) is studied. We show that this information is noncommutative in a very strong sense and present asymptotic relations between the values IG(y:x), IG(x:y), KG(x) and KG(y).
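For orientation (a hedged sketch, not taken verbatim from the paper): KG(x) plays for prediction the role that Kolmogorov complexity plays for description length, lower-bounding the cumulative loss that any prediction strategy can incur on the sequence x under the loss function G. Assuming the amount of predictive information is defined by analogy with algorithmic mutual information, it would read

  IG(y:x) = KG(x) − KG(x|y),

i.e. the reduction in achievable loss on x when y is available as side information. On this reading, the strong noncommutativity stated in the abstract means that IG(y:x) and IG(x:y) can differ drastically, unlike the (almost) symmetric algorithmic information I(y:x) ≈ I(x:y) for Kolmogorov complexity.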
Keywords: Machine learning, Algorithmic prediction, Predictive complexity, Predictive information, Loss functions, Kolmogorov complexity, Expanding property
Article history: Received 8 January 2003, Revised 13 May 2003, Available online 1 December 2004.
DOI: https://doi.org/10.1016/j.jcss.2004.10.005