Tracking Context Changes through Meta-Learning
Author: Gerhard Widmer
Abstract
The article deals with the problem of learning incrementally (‘on-line’) in domains where the target concepts are context-dependent, so that changes in context can produce more or less radical changes in the associated concepts. In particular, we concentrate on a class of learning tasks where the domain provides explicit clues as to the current context (e.g., attributes with characteristic values). A general two-level learning model is presented that effectively adjusts to changing contexts by trying to detect (via ‘meta-learning’) contextual clues and using this information to focus the learning process. Context learning and detection occur during regular on-line learning, without separate training phases for context recognition. Two operational systems based on this model are presented that differ in the underlying learning algorithm and in the way they use contextual information: METAL(B) combines meta-learning with a Bayesian classifier, while METAL(IB) is based on an instance-based learning algorithm. Experiments with synthetic domains as well as a number of ‘real-world’ problems show that the algorithms are robust in a variety of dimensions, and that meta-learning can produce substantial increases in accuracy over simple object-level learning in situations with changing contexts.
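The abstract describes a two-level scheme: an ordinary on-line learner at the object level, plus a meta-level that watches the incoming stream for attributes acting as contextual clues and uses them to focus learning. The class and method names below are illustrative assumptions, not the paper's actual METAL(B) code; this is a minimal sketch of the idea using a windowed naive Bayes learner whose meta-level flags attributes whose value distribution has shifted within the window.

```python
from collections import Counter, deque

class MetalBSketch:
    """Illustrative sketch (hypothetical names, not the published METAL(B)
    implementation): a sliding-window naive Bayes classifier plus a simple
    meta-level that flags candidate context attributes."""

    def __init__(self, window=100, n_attrs=3):
        self.window = deque(maxlen=window)  # recent (attributes, class) pairs
        self.n_attrs = n_attrs

    def learn(self, x, y):
        # Object-level learning: just remember the example; old examples
        # fall out of the window, giving basic adaptation to drift.
        self.window.append((x, y))

    def _counts(self, examples):
        cls = Counter(y for _, y in examples)
        cond = [Counter() for _ in range(self.n_attrs)]
        for x, y in examples:
            for i, v in enumerate(x):
                cond[i][(v, y)] += 1
        return cls, cond

    def predict(self, x):
        if not self.window:
            return None
        cls, cond = self._counts(self.window)
        n = len(self.window)
        best, best_p = None, -1.0
        for c, cc in cls.items():
            p = cc / n
            for i, v in enumerate(x):
                # Laplace-smoothed conditional probability P(v | c)
                p *= (cond[i][(v, c)] + 1) / (cc + 2)
            if p > best_p:
                best, best_p = c, p
        return best

    def context_attrs(self, threshold=0.2):
        """Meta-level heuristic: treat an attribute as a context clue if its
        value distribution in the newer half of the window differs markedly
        (total-variation distance) from the older half."""
        w = list(self.window)
        half = len(w) // 2
        if half == 0:
            return []
        old, new = w[:half], w[half:]
        clues = []
        for i in range(self.n_attrs):
            po = Counter(x[i] for x, _ in old)
            pn = Counter(x[i] for x, _ in new)
            vals = set(po) | set(pn)
            tv = 0.5 * sum(abs(po[v] / len(old) - pn[v] / len(new))
                           for v in vals)
            if tv >= threshold:
                clues.append(i)
        return clues
```

Note that, as in the paper's setting, context detection here happens during regular on-line learning (by inspecting the same example window), with no separate training phase for context recognition; the actual systems use more principled statistical tests and exploit the detected clues to focus the object-level learner.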
Keywords: Meta-learning, on-line learning, context dependence, concept drift, transfer
Paper URL: https://doi.org/10.1023/A:1007365809034