HEFT: A History-Enhanced Feature Transfer framework for incremental event detection
Authors:
Abstract:
Incremental event detection is a challenging subfield of event detection that applies to a range of complex settings, especially real-time streaming scenarios where new event classes continuously emerge. To handle the catastrophic forgetting problem, knowledge distillation has proven to be a promising mechanism, since it requires no extra tools or data storage. However, conventional knowledge distillation-based methods are hampered by two challenging issues: (a) there is an inherent conflict between preserving historical information from old tasks and adapting to new tasks when model output features are shared; (b) historical category information is not fully utilized, making the model prone to forgetting knowledge of previously observed tasks. In this work, we present a History-Enhanced Feature Transfer (HEFT) framework to address these two challenges. For the first challenge, we employ a feature transfer module that decouples the preserving and adapting sub-functions by reconstructing the old-task features. In this manner, the reconstructed features are responsible for preserving historical knowledge, while the feature extractor and its subsequent classifier need only focus on classifying new tasks. For the second issue, a history-enhanced question answering model is introduced to reinforce the ability to memorize historical information, leveraging the previously trained categories as clues to guide the reasoning process. The experimental results show that HEFT outperforms the state-of-the-art model by 6.2% and 9.1% in overall F1 score on the ACE 2005 and TAC KBP 2017 benchmarks, respectively. An ablation study and case study further demonstrate that HEFT overcomes catastrophic forgetting of old-class event triggers without relying heavily on preserved historical samples.
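The knowledge distillation mechanism the abstract builds on can be illustrated with a minimal sketch: the new (student) model is penalized for drifting away from the old (teacher) model's softened predictions on the old classes. This is a generic distillation loss, not the paper's specific HEFT objective; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens the
    # distribution, exposing more of the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy of the student's softened distribution against the
    # teacher's, computed over the old-task classes the teacher knows.
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return -sum(ti * math.log(si) for ti, si in zip(t, s))
```

When the student reproduces the teacher's logits exactly, the loss reduces to the entropy of the softened teacher distribution (its minimum); any divergence on old classes increases it, which is what discourages forgetting.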
Keywords: Incremental event detection, History-Enhanced Feature Transfer, Knowledge distillation, Catastrophic forgetting
Article history: Received 31 October 2021, Revised 15 July 2022, Accepted 3 August 2022, Available online 11 August 2022, Version of Record 19 August 2022.
DOI: https://doi.org/10.1016/j.knosys.2022.109601