TY - GEN
T1 - Weak relation enforcement for kinematic-informed long-term stock prediction with artificial neural networks
AU - Selitskiy, Stanislav
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.
PY - 2024/6/21
Y1 - 2024/6/21
N2 - We propose weak loss-function enforcement of the velocity relations between time-series points in Kinematic-Informed artificial Neural Networks (KINN) for long-term stock prediction. Problems of series volatility, Out-of-Distribution (OOD) test data, and outliers in the training data are addressed by the Artificial Neural Network (ANN) learning not only future-point prediction but also the velocity relations between the points, thereby avoiding unrealistic spurious predictions. The presented loss function penalizes not only errors between predictions and supervised label data, but also errors between the next-point prediction and the previous point plus the velocity prediction. The loss function is tested on multiple popular and exotic auto-regressive (AR) ANN architectures over around fifteen years of Dow Jones index data, and demonstrates statistically meaningful improvement across the normalization-sensitive activation functions prone to spurious behaviour under OOD data conditions. The results show that such an architecture addresses the normalization issue in auto-regressive models, which breaks the data topology, by weakly enforcing preservation of data neighbourhood proximity (relations) during the ANN transformation.
AB - We propose weak loss-function enforcement of the velocity relations between time-series points in Kinematic-Informed artificial Neural Networks (KINN) for long-term stock prediction. Problems of series volatility, Out-of-Distribution (OOD) test data, and outliers in the training data are addressed by the Artificial Neural Network (ANN) learning not only future-point prediction but also the velocity relations between the points, thereby avoiding unrealistic spurious predictions. The presented loss function penalizes not only errors between predictions and supervised label data, but also errors between the next-point prediction and the previous point plus the velocity prediction. The loss function is tested on multiple popular and exotic auto-regressive (AR) ANN architectures over around fifteen years of Dow Jones index data, and demonstrates statistically meaningful improvement across the normalization-sensitive activation functions prone to spurious behaviour under OOD data conditions. The results show that such an architecture addresses the normalization issue in auto-regressive models, which breaks the data topology, by weakly enforcing preservation of data neighbourhood proximity (relations) during the ANN transformation.
KW - Financial series
KW - Graph neural networks
KW - Kinematic-informed neural networks
KW - Physics-aware neural networks
UR - https://www.scopus.com/pages/publications/85199507622
U2 - 10.1007/978-3-031-62269-4_18
DO - 10.1007/978-3-031-62269-4_18
M3 - Conference contribution
AN - SCOPUS:85199507622
SN - 9783031622687
T3 - Lecture Notes in Networks and Systems
SP - 249
EP - 261
BT - Intelligent Computing - Proceedings of the 2024 Computing Conference
A2 - Arai, Kohei
PB - Springer
T2 - Science and Information Conference, SAI 2024
Y2 - 11 July 2024 through 12 July 2024
ER -
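The abstract describes a loss that combines the usual prediction error with a penalty on the mismatch between the predicted next point and the previous point plus a predicted velocity. A minimal sketch of that idea follows; the function name, argument names, and the weighting `lam` are assumptions for illustration, not taken from the paper itself:

```python
import numpy as np

def kinn_weak_velocity_loss(pred_next, pred_vel, prev_point, true_next, lam=1.0):
    """Weak (soft) enforcement of the velocity relation, per the abstract.

    Two terms are combined:
      - the data term: squared error between the next-point prediction
        and the supervised label, and
      - the kinematic term: squared error between the next-point
        prediction and the previous point plus the predicted velocity.
    `lam` is an assumed weighting hyperparameter between the two terms.
    """
    data_term = np.mean((pred_next - true_next) ** 2)
    kinematic_term = np.mean((pred_next - (prev_point + pred_vel)) ** 2)
    return data_term + lam * kinematic_term
```

Because the kinematic term is added to the objective rather than imposed as a hard constraint, the velocity relation is only weakly enforced: predictions that jump away from the previous point without a matching velocity are penalized, not forbidden.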