Enhancing Stock Market Predictions with Multi-Feature Time2Vec-Transformer Models

© 2025 by IJCTT Journal
Volume-73 Issue-1
Year of Publication : 2025
Authors : Prakhar Srivastava
DOI : 10.14445/22312803/IJCTT-V73I1P101

How to Cite?

Prakhar Srivastava, "Enhancing Stock Market Predictions with Multi-Feature Time2Vec-Transformer Models," International Journal of Computer Trends and Technology, vol. 73, no. 1, pp. 1-18, 2025. Crossref, https://doi.org/10.14445/22312803/IJCTT-V73I1P101

Abstract
Accurate financial forecasting is inherently challenging because stock price movements are complex, dynamic, and driven by diverse, interrelated factors. Traditional models often fail to adequately capture both short-term volatility and long-term dependencies. This paper introduces a novel financial prediction model that combines the Transformer architecture with Time2Vec, extending existing methodologies by integrating multiple correlated market features. The approach improves prediction accuracy and reduces errors under extreme market conditions. Extensive experiments on the NASDAQ and S&P 500 indices and on Exxon Mobil stock demonstrate that the proposed model significantly outperforms traditional methods such as LSTM and RNN. By leveraging correlated market features, the model captures intricate inter-feature relationships, leading to improved forecasting performance. The study highlights the advantages of incorporating correlation-aware features in financial models and discusses potential applications in broader financial markets. These findings pave the way for more robust prediction systems, offering valuable insights for investors, analysts, and policymakers in managing financial risks and opportunities.
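
To make the architecture concrete, the following is a minimal PyTorch sketch of the Time2Vec-plus-Transformer idea. It follows the Time2Vec definition of Kazemi et al. [15] (one linear term plus sinusoidal periodic terms); the layer sizes, the sine activation, the feature set, and the single-step forecasting head are illustrative assumptions, not the paper's exact configuration.

import torch
import torch.nn as nn

class Time2Vec(nn.Module):
    """Time2Vec embedding (Kazemi et al. [15]):
    t2v(tau)[0] = w0 * tau + b0         (linear, non-periodic trend term)
    t2v(tau)[i] = sin(wi * tau + bi)    (periodic terms, i >= 1)
    """
    def __init__(self, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(1, 1)              # non-periodic component
        self.periodic = nn.Linear(1, out_dim - 1)  # periodic components

    def forward(self, tau: torch.Tensor) -> torch.Tensor:
        # tau: (batch, seq_len, 1) -> (batch, seq_len, out_dim)
        return torch.cat([self.linear(tau),
                          torch.sin(self.periodic(tau))], dim=-1)

class T2VTransformer(nn.Module):
    """Concatenate correlated market features with the Time2Vec encoding,
    run a Transformer encoder, and predict the next value.
    All hyperparameters below are illustrative assumptions."""
    def __init__(self, n_features: int, t2v_dim: int = 8, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.t2v = Time2Vec(t2v_dim)
        self.proj = nn.Linear(n_features + t2v_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)  # one-step-ahead forecast

    def forward(self, x: torch.Tensor, tau: torch.Tensor) -> torch.Tensor:
        # x:   (batch, seq_len, n_features) correlated features (e.g. OHLCV)
        # tau: (batch, seq_len, 1) time index fed to the Time2Vec embedding
        z = self.proj(torch.cat([x, self.t2v(tau)], dim=-1))
        return self.head(self.encoder(z)[:, -1])  # last time step's encoding

# Toy usage: two 30-day windows of 5 correlated features
model = T2VTransformer(n_features=5)
x = torch.randn(2, 30, 5)
tau = torch.arange(30.).view(1, 30, 1).repeat(2, 1, 1)
print(model(x, tau).shape)  # torch.Size([2, 1])

In this sketch, the Time2Vec encoding plays the role of the Transformer's usual fixed positional encoding, which is how it is typically combined with attention-based models for time series.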

Keywords
Financial forecasting, Transformer, Time2Vec, Stock price prediction, NASDAQ, S&P 500, Exxon Mobil, LSTM, RNN, Correlated market features.

Reference

[1] Lasse Heje Pedersen, Efficiently Inefficient: How Smart Money Invests and Market Prices Are Determined, Princeton University Press, pp. 1-368, 2019.
[Google Scholar] [Publisher Link]
[2] Everette S. Gardner, and E.D. McKenzie, “Forecasting Trends in Time Series,” Management Science, vol. 31, no. 10, pp. 1237-1246, 1985.
[CrossRef] [Google Scholar] [Publisher Link]
[3] Artemios-Anargyros Semenoglou et al., “Investigating the Accuracy of Cross-Learning Time Series Forecasting Methods,” International Journal of Forecasting, vol. 37, no. 3, pp. 1072-1084, 2021.
[CrossRef] [Google Scholar] [Publisher Link]
[4] Sepp Hochreiter, and Jürgen Schmidhuber, “Long Short-Term Memory,” Neural Computation, vol. 9, no. 8, pp. 1735-1780, 1997.
[CrossRef] [Google Scholar] [Publisher Link]
[5] Shiyang Li et al., “Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting,” Advances in Neural Information Processing Systems 32 (NeurIPS), 2019.
[Google Scholar] [Publisher Link]
[6] Ashish Vaswani et al., “Attention Is All You Need,” ArXiv, pp. 1-15, 2017.
[CrossRef] [Google Scholar] [Publisher Link]
[7] Neural Networks from Scratch, Victor Zhou, 2019. [Online]. Available: https://victorzhou.com/series/neural-networks-from-scratch/
[8] Souhaib Ben Taieb, Antti Sorjamaa, and Gianluca Bontempi, “Multiple-Output Modeling for Multi-Step-Ahead Time Series Forecasting,” Neurocomputing, vol. 73, no. 10-12, pp. 1950-1957, 2010.
[CrossRef] [Google Scholar] [Publisher Link]
[9] Massimiliano Marcellino, James H. Stock, and Mark W. Watson, “A Comparison of Direct and Iterated Multistep AR Methods for Forecasting Macroeconomic Time Series,” Journal of Econometrics, vol. 135, no. 1-2, pp. 499-526, 2006.
[CrossRef] [Google Scholar] [Publisher Link]
[10] David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams, “Learning Representations by Back-Propagating Errors,” Nature, vol. 323, no. 6088, pp. 533-536, 1986.
[Google Scholar] [Publisher Link]
[11] Tianyu Wang et al., “From Model-Driven to Data-Driven: A Review of Hysteresis Modeling in Structural and Mechanical Systems,” Mechanical Systems and Signal Processing, vol. 204, pp. 1-39, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[12] Understanding LSTM Networks, Colah’s Blog, Github.io, 2015. [Online]. Available: https://colah.github.io/posts/2015-08-Understanding-LSTMs/
[13] Jia Wang et al., “CLVSA: A Convolutional LSTM Based Variational Sequence-to-Sequence Model with Attention for Predicting Trends of Financial Markets,” Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI), pp. 3705-3711, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[14] Fuli Feng et al., “Enhancing Stock Movement Prediction with Adversarial Training,” ArXiv, pp. 1-7, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[15] Seyed Mehran Kazemi et al., “Time2Vec: Learning a Vector Representation of Time,” ArXiv, pp. 1-16, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[16] Jacob Devlin et al., “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,” Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 4171-4186, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[17] Qingsong Wen et al., “Transformers in Time Series: A Survey,” ArXiv, pp. 1-9, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[18] RMSProp - Cornell University Computational Optimization Open Textbook - Optimization Wiki, Cornell.edu, 2022. [Online]. Available: https://optimization.cbe.cornell.edu/index.php?title=RMSProp
[19] AdaGrad - Cornell University Computational Optimization Open Textbook - Optimization Wiki, Cornell.edu, 2022. [Online]. Available: https://optimization.cbe.cornell.edu/index.php?title=AdaGrad
[20] Jian-Hang Li et al., “Multi-Head Attention Based Hybrid Deep Neural Network for Aeroengine Risk Assessment,” IEEE Access, vol. 11, pp. 113376-113389, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[21] Mandella Ali M. Fargalla et al., “TimeNet: Time2Vec Attention-Based CNN-BiGRU Neural Network for Predicting Production in Shale and Sandstone Gas Reservoirs,” Energy, vol. 290, 2024.
[CrossRef] [Google Scholar] [Publisher Link]
[22] Shunrong Shen, Haomiao Jiang, and Tongda Zhang, “Stock Market Forecasting Using Machine Learning Algorithms,” Department of Electrical Engineering, Stanford University, pp. 1-5, 2012.
[Google Scholar] [Publisher Link]
[23] Yahoo Finance, Yahoo, 2024. [Online]. Available: https://finance.yahoo.com/
[24] Residual Neural Network, Wikipedia.org, 2017. [Online]. Available: https://en.wikipedia.org/wiki/Residual_neural_network