A Comparative Study of Neural Network Architectures for Stock Price Forecasting
DOI:
https://doi.org/10.54097/q4r00490
Keywords:
Stock Price Forecasting; Financial Forecasting; Neural Network; CNN-LSTM Hybrid Model
Abstract
Stock prices pose significant challenges in the field of financial forecasting, being influenced by market noise and nonlinear factors. This study applies a CNN-LSTM hybrid model, a Transformer, a multi-layer perceptron (MLP), and a long short-term memory network (LSTM) to forecast Apple Inc. (AAPL) stock prices over a five-year historical period. The data are preprocessed using Min-Max normalization and a 60-day sliding window. The models are trained in the PyTorch framework and optimized using the mean squared error (MSE) loss function. According to the experimental results, the CNN-LSTM model outperforms the other models on all evaluation indices, including MAE, MSE, and RMSE; compared with the baseline model (MLP), it reduces the MSE by approximately 46%. This study validates the CNN-LSTM hybrid model's superiority in time series analysis, providing an efficient tool for financial decision-making and laying a solid foundation for future research on integrating external factors such as market sentiment and macroeconomic indicators.
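The preprocessing described in the abstract combines Min-Max normalization with a 60-day sliding window over the price series. A minimal sketch of that pipeline is shown below; the function names and the placeholder price series are illustrative assumptions, not code from the paper:

```python
def min_max_scale(prices):
    # Rescale a price series to [0, 1] (Min-Max normalization).
    lo, hi = min(prices), max(prices)
    return [(p - lo) / (hi - lo) for p in prices]

def sliding_windows(series, window=60):
    # Each input sample is `window` consecutive scaled prices;
    # the target is the price immediately after the window.
    X = [series[i:i + window] for i in range(len(series) - window)]
    y = series[window:]
    return X, y

prices = [100.0 + 0.3 * t for t in range(300)]   # placeholder series, not AAPL data
scaled = min_max_scale(prices)
X, y = sliding_windows(scaled, window=60)
print(len(X), len(X[0]), len(y))  # 240 60 240
```

Each of the resulting (X, y) pairs would then feed whichever architecture is being compared (MLP, LSTM, Transformer, or CNN-LSTM), with MSE as the training loss.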
References
[1] D.J. Bartholomew, G.E.P. Box, G.M. Jenkins, Time series analysis forecasting and control. Oper. Res. Q. 22, 199 (1971).
[2] T. Bollerslev, A conditionally heteroskedastic time series model for speculative prices and rates of return. Rev. Econ. Stat. 69, 542 (1987).
[3] S. Siami-Namini, N. Tavakoli, A.S. Namin, The performance of LSTM and BiLSTM in forecasting time series, in Proceedings of the IEEE International Conference on Big Data (Big Data) (2019), pp. 3285–3292.
[4] Y.A. LeCun, L. Bottou, G.B. Orr, K.-R. Müller, Efficient BackProp, in Lecture Notes in Computer Science (Springer, 2012), pp. 9–48.
[5] J.W. Wilder, New Concepts in Technical Trading Systems (Trend Research, 1978). http://dspace.lib.uom.gr/handle/2159/29408
[6] H. Hassani, M.R. Yeganegi, Selecting optimal lag order in Ljung–Box test. Physica A 541, 123700 (2019).
[7] V. Suárez-Paniagua, VSP at eHealth-KD Challenge 2019: recurrent neural networks for relation classification in Spanish eHealth documents, in IberLEF@SEPLN (2019), pp. 95–104. http://ceur-ws.org/Vol-2421/eHealth-KD_paper_10.pdf
[8] S. Hochreiter, J. Schmidhuber, Long short-term memory. Neural Comput. 9, 1735–1780 (1997).
[9] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A.N. Gomez, Ł. Kaiser, I. Polosukhin, Attention is all you need. Adv. Neural Inf. Process. Syst. 30, 5998–6008 (2017).
[10] W. Bao, J. Yue, Y. Rao, A deep learning framework for financial time series using stacked autoencoders and long-short term memory. PLoS ONE 12, e0180944 (2017).
[11] R.J. Hyndman, G. Athanasopoulos, Forecasting: Principles and Practice, 2nd ed. (OTexts, 2018). https://otexts.com/fpp2/
[12] D.M.Q. Nelson, A.C.M. Pereira, R.A. de Oliveira, Stock market's price movement prediction with LSTM neural networks, in Proceedings of the International Joint Conference on Neural Networks (IJCNN) (2017), pp. 1419–1426.
[13] F.A. Gers, J. Schmidhuber, F. Cummins, Learning to forget: continual prediction with LSTM. Neural Comput. 12, 2451–2471 (2000).
[14] S. Selvin, R. Vinayakumar, E.A. Gopalakrishnan, V.K. Menon, K.P. Soman, Stock price prediction using LSTM, RNN and CNN-sliding window model, in Proceedings of the International Conference on Advances in Computing, Communications and Informatics (ICACCI) (2017), pp. 1643–1647.
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.