ISA Trans. 2022 Apr;123:200-217.
doi: 10.1016/j.isatra.2021.05.026. Epub 2021 May 19.

Causal augmented ConvNet: A temporal memory dilated convolution model for long-sequence time series prediction


Abiodun Ayodeji et al. ISA Trans. 2022 Apr.

Abstract

A number of deep learning models have been proposed to capture the inherent information in multivariate time series signals. However, most existing models are suboptimal, especially for long-sequence time series prediction tasks. This work presents a causal augmented convolution network (CaConvNet) and its application to long-sequence time series prediction. First, the model uses dilated convolution with enlarged receptive fields to enhance global feature extraction from time series. Second, to effectively capture long-term dependencies and to further extract multiscale features that represent different operating conditions, the model is augmented with a long short-term memory (LSTM) network. Third, the CaConvNet is optimized with a dynamic hyperparameter search algorithm to reduce the uncertainty and cost of manual hyperparameter selection. Finally, the model is extensively evaluated on a predictive maintenance task using the turbofan aircraft engine run-to-failure prognostic benchmark dataset (C-MAPSS). The performance of the proposed CaConvNet is also compared with four conventional deep learning models and seven state-of-the-art predictive models. The evaluation metrics show that the proposed CaConvNet outperforms the other models in most of the prognostic tasks. Moreover, a comprehensive ablation study is performed to provide insight into the contribution of each sub-structure of the CaConvNet model to the observed performance. The results of the ablation study, as well as the performance improvement of CaConvNet, are discussed in this paper.
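The abstract describes an architecture that combines causal dilated convolutions (for enlarged receptive fields) with an LSTM (for long-term dependency), trained for remaining-useful-life prediction on C-MAPSS. The following is a minimal PyTorch sketch of that general idea, not the authors' implementation: the layer sizes, dilation schedule, channel count, window length, and regression head are all assumptions introduced for illustration.

# Minimal sketch (not the authors' code) of a causal dilated convolution
# stack followed by an LSTM, as described in the abstract. Layer widths,
# the dilation schedule, and the single-output RUL head are assumptions.
import torch
import torch.nn as nn


class CausalDilatedConv1d(nn.Module):
    """1D convolution with left-only padding so outputs never see future steps."""

    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # amount of left padding for causality
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):  # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))  # pad on the left only
        return self.conv(x)


class CaConvNetSketch(nn.Module):
    """Dilated causal conv blocks (global features) + LSTM (long-term memory)."""

    def __init__(self, n_features, hidden=64, lstm_hidden=32, dilations=(1, 2, 4, 8)):
        super().__init__()
        convs, in_ch = [], n_features
        for d in dilations:  # receptive field grows with each dilation step
            convs += [CausalDilatedConv1d(in_ch, hidden, kernel_size=3, dilation=d),
                      nn.ReLU()]
            in_ch = hidden
        self.conv_stack = nn.Sequential(*convs)
        self.lstm = nn.LSTM(hidden, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, 1)  # one RUL estimate per input window

    def forward(self, x):  # x: (batch, time, n_features), e.g. C-MAPSS sensor windows
        z = self.conv_stack(x.transpose(1, 2))       # -> (batch, hidden, time)
        out, _ = self.lstm(z.transpose(1, 2))        # -> (batch, time, lstm_hidden)
        return self.head(out[:, -1, :]).squeeze(-1)  # predict from the last time step


if __name__ == "__main__":
    model = CaConvNetSketch(n_features=24)  # 24 sensor/setting channels assumed
    dummy = torch.randn(8, 30, 24)          # batch of eight 30-step windows
    print(model(dummy).shape)               # torch.Size([8])

In this sketch, the convolutional stack extracts multiscale features across the whole window, and the LSTM consumes the resulting feature sequence before a linear head regresses the remaining useful life; the paper's dynamic hyperparameter search would tune quantities such as `hidden`, `lstm_hidden`, and the dilation schedule rather than fixing them by hand.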

Keywords: Deep learning; Dilated convolution neural network; Predictive maintenance; Remaining useful life; Time series.


Conflict of interest statement

Declaration of Competing Interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
