Learning in the presence of concept drift and hidden contexts

Authors: Gerhard Widmer, Miroslav Kubat

Abstract

On-line learning in domains where the target concept depends on some hidden context poses serious problems. A changing context can induce changes in the target concepts, producing what is known as concept drift. We describe a family of learning algorithms that flexibly react to concept drift and can take advantage of situations where contexts reappear. The general approach underlying all these algorithms consists of (1) keeping only a window of currently trusted examples and hypotheses; (2) storing concept descriptions and reusing them when a previous context reappears; and (3) controlling both of these functions by a heuristic that constantly monitors the system's behavior. The paper reports on experiments that test the systems' performance under various conditions, such as different levels of noise and different extents and rates of concept drift.
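The three ingredients named in the abstract (an adaptive window of trusted examples, a store of old concept descriptions, and a monitoring heuristic that controls both) can be illustrated with a small sketch. The class below is a hypothetical toy, not the authors' FLORA implementation: all names, thresholds, and the 1-nearest-neighbour "concept description" are illustrative assumptions made for this example.

```python
from collections import deque


class WindowedDriftLearner:
    """Toy on-line learner with an adaptive window and a store of old
    concepts. Hypothetical sketch only, not the paper's actual algorithms."""

    def __init__(self, max_window=60, min_window=10, acc_window=20,
                 drift_threshold=0.6, stable_threshold=0.9):
        self.max_window = max_window
        self.min_window = min_window
        self.drift_threshold = drift_threshold
        self.stable_threshold = stable_threshold
        self.window = deque(maxlen=max_window)       # currently trusted examples
        self.recent_hits = deque(maxlen=acc_window)  # 1 = correct prediction
        self.stored_concepts = []                    # snapshots of old contexts
        self._was_stable = False

    @staticmethod
    def _predict_with(examples, x):
        # A "concept description" here is simply 1-nearest-neighbour over
        # a list of (features, label) pairs -- an illustrative stand-in.
        if not examples:
            return None
        dist = lambda a: sum((ai - xi) ** 2 for ai, xi in zip(a, x))
        return min(examples, key=lambda ex: dist(ex[0]))[1]

    def predict(self, x):
        return self._predict_with(self.window, x)

    def _accuracy(self):
        if not self.recent_hits:
            return 1.0
        return sum(self.recent_hits) / len(self.recent_hits)

    def learn(self, x, y):
        # (3) Monitor: score the current hypothesis on the incoming example.
        pred = self.predict(x)
        if pred is not None:
            self.recent_hits.append(1 if pred == y else 0)
        self.window.append((x, y))

        acc = self._accuracy()
        if acc >= self.stable_threshold and len(self.window) >= self.min_window:
            # (2) Stable phase: snapshot the current concept for later reuse.
            if not self._was_stable:
                self.stored_concepts.append(list(self.window))
            self._was_stable = True
        elif acc < self.drift_threshold:
            # (1) Suspected drift: forget old examples by shrinking the window.
            while len(self.window) > self.min_window:
                self.window.popleft()
            # (2) Reuse a stored concept if it explains the recent data well.
            recent = list(self.window)

            def score(concept):
                return sum(self._predict_with(concept, xi) == yi
                           for xi, yi in recent)

            best = max(self.stored_concepts, key=score, default=None)
            if best is not None and score(best) >= self.stable_threshold * len(recent):
                self.window = deque(best, maxlen=self.max_window)
            self.recent_hits.clear()
            self._was_stable = False
```

Feeding such a learner a stream whose labelling rule alternates between two contexts should show the window shrinking at each switch and the second occurrence of a context being picked up faster via the stored snapshot, which is the kind of behavior the abstract's experiments probe.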

Keywords: Incremental concept learning, on-line learning, context dependence, concept drift, forgetting


Paper link: https://doi.org/10.1007/BF00116900