. 2017 Jul 14;12(7):e0180944.
doi: 10.1371/journal.pone.0180944. eCollection 2017.

A deep learning framework for financial time series using stacked autoencoders and long-short term memory

Wei Bao et al. PLoS One. 2017.

Abstract

The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long-short term memory (LSTM) are combined for stock price forecasting. SAEs, which hierarchically extract deep features, are introduced into stock price forecasting for the first time. The framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, the high-level denoised features are fed into LSTM to forecast the next day's closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms similar models in both predictive accuracy and profitability.
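The first stage above removes noise from the raw price series with a wavelet transform. A minimal one-level Haar decomposition with soft thresholding of the detail coefficients illustrates the idea; this is a toy sketch, not the paper's multi-level decomposition, and `haar_denoise` and its `threshold` parameter are hypothetical:

```python
import numpy as np

def haar_denoise(x, threshold=0.1):
    """One-level Haar wavelet denoising: decompose the series into a
    coarse signal and a detail signal, soft-threshold the details
    (treated as noise), then reconstruct."""
    x = np.asarray(x, dtype=float)
    assert len(x) % 2 == 0, "this toy version needs an even-length series"
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)   # coarse signal (S in Fig 1)
    detail = (even - odd) / np.sqrt(2)   # detail signal (D in Fig 1)
    # soft-threshold the detail coefficients
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    # inverse one-level Haar transform
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out
```

A constant series has zero detail coefficients, so it passes through unchanged; only the high-frequency component is shrunk.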


Conflict of interest statement

Competing Interests: The authors have declared that no competing interests exist.

Figures

Fig 1
Fig 1. The flowchart of the proposed deep learning framework for financial time series.
D(j) is the detail signal at level j. S(J) is the coarsest signal at level J. I(t) and O(t) denote the denoised feature and the one-step-ahead output at time step t, respectively. N is the number of delays of the LSTM.
Fig 2
Fig 2. The flowchart of the single layer autoencoder.
The model learns a hidden feature a(x) from input x by reconstructing it as x'. Here, W1 and W2 are the weights of the hidden layer and the reconstruction layer, respectively; b1 and b2 are the corresponding biases.
Fig 3
Fig 3. An instance of a stacked autoencoder with 5 layers, trained as 4 successive autoencoders.
Fig 4
Fig 4. A recurrent neural network and the unfolding architecture.
U, V and W are the weights of the hidden layer, the output layer and the hidden state, respectively. xt and ot are the input vector and the output result at time t, respectively.
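Using the notation of the Fig 4 caption, the unfolded recurrence can be written as h_t = tanh(U x_t + W h_{t-1}) and o_t = V h_t. A minimal sketch (the `tanh` nonlinearity and `rnn_unfold` helper are assumptions for illustration):

```python
import numpy as np

def rnn_unfold(xs, U, W, V, h0):
    """Unfold a simple recurrent network (Fig 4) over a sequence xs:
    h_t = tanh(U x_t + W h_{t-1}),  o_t = V h_t."""
    h = h0
    outputs = []
    for x_t in xs:
        h = np.tanh(U @ x_t + W @ h)   # update hidden state
        outputs.append(V @ h)          # emit output at time t
    return outputs, h
```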
Fig 5
Fig 5. The architecture of an LSTM memory cell.
Fig 6
Fig 6. The repeating module in an LSTM.
Here, xt and ht are the input vector and the output of the memory cell at time t, respectively. Ct is the value of the memory cell. it, ft and ot are the values of the input gate, the forget gate and the output gate at time t, respectively. C̃t is the value of the candidate state of the memory cell at time t.
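Assuming the standard LSTM gate equations that the Fig 6 caption names (this is a generic sketch, not code from the paper; the parameter dictionary `p` is a hypothetical convenience):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, C_prev, p):
    """One step of an LSTM memory cell (Figs 5-6): input gate i_t,
    forget gate f_t, output gate o_t, candidate state C~_t."""
    z = np.concatenate([h_prev, x_t])            # joint input [h_{t-1}, x_t]
    f_t = sigmoid(p["Wf"] @ z + p["bf"])         # forget gate
    i_t = sigmoid(p["Wi"] @ z + p["bi"])         # input gate
    C_tilde = np.tanh(p["Wc"] @ z + p["bc"])     # candidate cell state
    C_t = f_t * C_prev + i_t * C_tilde           # updated memory cell
    o_t = sigmoid(p["Wo"] @ z + p["bo"])         # output gate
    h_t = o_t * np.tanh(C_t)                     # cell output
    return h_t, C_t
```

With all parameters at zero, every gate evaluates to 0.5, so the cell state simply decays by half each step, which makes the update easy to verify by hand.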
Fig 7
Fig 7. Continuous dataset arrangement for training, validating and testing during the whole sample period.
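The continuous arrangement of Fig 7 slides the train/validate/test windows forward through the sample period. A minimal generator conveys the idea; `rolling_splits` and the window sizes are illustrative, not the paper's actual splits:

```python
def rolling_splits(n, train, val, test):
    """Yield successive (train, validate, test) index ranges that slide
    forward by the test size, so every test window is out-of-sample
    relative to the data used to fit and tune the model."""
    start = 0
    while start + train + val + test <= n:
        tr = range(start, start + train)
        va = range(start + train, start + train + val)
        te = range(start + train + val, start + train + val + test)
        yield tr, va, te
        start += test   # advance by one test window
```

For example, `rolling_splits(10, 4, 2, 2)` yields two windows: the second starts where the first test window began, so no future data ever leaks into training.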
Fig 8
Fig 8. The actual data and the predicted data from the four models for each stock index in Year 1, from 2010.10.01 to 2011.09.30.
