Sequential predictions based on algorithmic complexity
Authors: Marcus Hutter
Abstract
This paper studies sequence prediction based on the monotone Kolmogorov complexity Km = -log m, i.e. based on universal deterministic/one-part MDL. m is extremely close to Solomonoff's universal prior M, the latter being an excellent predictor in deterministic as well as probabilistic environments, where performance is measured in terms of convergence of posteriors or losses. Despite this closeness to M, it is difficult to assess the prediction quality of m, since little is known about the closeness of their posteriors, which are the important quantities for prediction. We show that for deterministic computable environments, the “posterior” and losses of m converge, but rapid convergence could be shown only on-sequence; off-sequence convergence can be slow. In probabilistic environments, neither the posterior nor the losses converge, in general.
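As a reading aid, the following LaTeX sketch recalls the standard definitions behind the symbols Km, m, and M appearing in the abstract. The universal monotone Turing machine U, the program length \ell(p), and the notation x* for "any string with prefix x" are the usual conventions of algorithmic information theory, assumed here rather than quoted from the paper.

% Standard AIT definitions (assumed conventions, not quoted from the paper):
% U is a universal monotone Turing machine, p a binary program, \ell(p) its length,
% x a finite binary string, b the next symbol, and x* any output with prefix x.
\begin{align*}
  Km(x) &= \min\{\ell(p) : U(p) = x*\}
        && \text{(monotone Kolmogorov complexity)} \\
  m(x)  &= 2^{-Km(x)}
        && \text{(so } Km = -\log_2 m \text{; one-part MDL)} \\
  M(x)  &= \sum_{p\,:\,U(p)=x*} 2^{-\ell(p)}
        && \text{(Solomonoff's universal prior, sum over minimal } p\text{)} \\
  m(b \mid x) &= \frac{m(xb)}{m(x)}
        && \text{(the ``posterior'' used for predicting } b\text{)}
\end{align*}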
Keywords: Sequence prediction, Algorithmic information theory, Solomonoff's prior, Monotone Kolmogorov complexity, Minimal description length, Convergence, Self-optimization
Article history: Received 22 October 2003, Revised 1 July 2005, Available online 19 August 2005.
DOI: https://doi.org/10.1016/j.jcss.2005.07.001