TimeCLR: A self-supervised contrastive learning framework for univariate time series representation

Authors:

Highlights:

• We propose a novel method for time series data augmentation named DTW data augmentation, which not only generates phase shifts and amplitude changes but also retains the structure and feature information of the time series (a DTW-warping sketch follows the highlights).

• We design a feature extractor that can generate time series representations in an end-to-end manner, drawing on the advantages of InceptionTime, the current state-of-the-art deep learning-based time series classification method.

• Combining the advantages of DTW data augmentation and the InceptionTime model, we successfully extend SimCLR to the time series domain (a SimCLR-style training sketch also follows the highlights).

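The paper's exact DTW data augmentation procedure is not given in this summary, so below is only a minimal NumPy sketch of the general idea: the original series is aligned, via a classic DTW alignment path, to a randomly time-warped and amplitude-scaled copy of itself, yielding a view with phase shifts and amplitude changes that still follows the original shape. The function names (dtw_path, dtw_augment) and the distortion scheme are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def dtw_path(a, b):
    """Classic DTW: accumulated-cost matrix plus backtracked alignment path
    between two 1-D series a and b."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # backtrack from (n, m) to (1, 1)
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def dtw_augment(x, sigma=0.2, rng=None):
    """Hypothetical DTW-guided augmentation (illustrative, not the paper's method):
    warp x towards a randomly distorted copy along the DTW alignment path, which
    shifts phase and rescales amplitude while keeping the overall shape."""
    rng = np.random.default_rng(rng)
    t = np.linspace(0.0, 1.0, len(x))
    # random smooth phase jitter and amplitude scaling of a copy of x
    warp = np.interp(t + sigma * np.sin(2 * np.pi * rng.uniform(1, 3) * t), t, x)
    scale = 1.0 + sigma * np.sin(2 * np.pi * rng.uniform(1, 3) * t + rng.uniform(0, 2 * np.pi))
    y = warp * scale
    # resample the distorted copy back onto x's time axis along the DTW path
    path = dtw_path(x, y)
    aligned = np.zeros(len(x))
    counts = np.zeros(len(x))
    for i, j in path:
        aligned[i] += y[j]
        counts[i] += 1
    return aligned / np.maximum(counts, 1)

if __name__ == "__main__":
    x = np.sin(np.linspace(0, 4 * np.pi, 128))
    x_aug = dtw_augment(x, sigma=0.2, rng=0)
    print(x.shape, x_aug.shape)  # both (128,)
```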

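Similarly, the following is a compact PyTorch sketch of a SimCLR-style objective on univariate series, pairing a toy Inception-style convolutional encoder and projection head with the NT-Xent loss over two augmented views. Layer sizes, kernel sizes, the temperature, and all class and function names are assumptions for illustration and do not reproduce the TimeCLR architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InceptionBlock1d(nn.Module):
    """Simplified Inception-style block: parallel 1-D convolutions with
    different kernel sizes plus a max-pool branch, concatenated channel-wise."""
    def __init__(self, in_ch, out_ch=32):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv1d(in_ch, out_ch, k, padding=k // 2) for k in (9, 19, 39)]
        )
        self.pool_branch = nn.Sequential(
            nn.MaxPool1d(3, stride=1, padding=1), nn.Conv1d(in_ch, out_ch, 1)
        )
        self.bn = nn.BatchNorm1d(out_ch * 4)

    def forward(self, x):
        y = torch.cat([b(x) for b in self.branches] + [self.pool_branch(x)], dim=1)
        return F.relu(self.bn(y))

class Encoder(nn.Module):
    """Toy InceptionTime-like encoder: stacked Inception blocks, global average
    pooling, and a SimCLR-style projection head."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.blocks = nn.Sequential(
            InceptionBlock1d(1), InceptionBlock1d(128), InceptionBlock1d(128)
        )
        self.head = nn.Sequential(nn.Linear(128, 128), nn.ReLU(), nn.Linear(128, feat_dim))

    def forward(self, x):            # x: (batch, 1, length)
        h = self.blocks(x).mean(-1)  # global average pooling -> (batch, 128)
        return self.head(h)

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss over two augmented views, as in SimCLR."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)      # (2N, d)
    sim = z @ z.t() / tau                                    # cosine similarities
    n = z1.size(0)
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool, device=z.device), float("-inf"))
    targets = torch.arange(2 * n, device=z.device).roll(n)   # index of each positive pair
    return F.cross_entropy(sim, targets)

if __name__ == "__main__":
    enc = Encoder()
    x1 = torch.randn(8, 1, 128)  # first augmented view of each series
    x2 = torch.randn(8, 1, 128)  # second augmented view of each series
    print(float(nt_xent(enc(x1), enc(x2))))
```
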
Keywords: Univariate time series, Representation learning, Self-supervised learning, Contrastive learning, Data augmentation

Article history: Received 28 September 2021; Revised 25 February 2022; Accepted 14 March 2022; Available online 21 March 2022; Version of Record 1 April 2022.

Paper URL: https://doi.org/10.1016/j.knosys.2022.108606