Spatial-temporal attention-based convolutional network with text and numerical information for stock price prediction

Publisher:
SPRINGER LONDON LTD
Publication Type:
Journal Article
Citation:
Neural Computing and Applications, 2022, 34, (17), pp. 14387-14395
Issue Date:
2022-09-01
Abstract:
In the financial market, stock price prediction is a challenging task that is influenced by many factors. These factors include economic changes, politics and global events, which are usually recorded in text form, such as daily news. We therefore assume that real-world text information can be used to forecast stock market activity. However, only a few works have considered both text and numerical information to predict or analyse stock trends. These works used preprocessed text features as model inputs, so latent information in the text may be lost because the relationships between the text and the stock price are not considered. In this paper, we propose a fusion network, i.e. a spatial-temporal attention-based convolutional network (STACN), that leverages the advantages of an attention mechanism, a convolutional neural network (CNN) and long short-term memory (LSTM) to extract text and numerical information for stock price prediction. By utilising an attention mechanism, the STACN extracts reliable text features that are highly relevant to stock value, which improves overall model performance. Experimental results on real-world stock data demonstrate that the STACN model and its training scheme can handle both text and numerical data and achieve high accuracy on stock regression tasks. The STACN is compared with CNNs and LSTMs under different settings, e.g. a CNN with only stock data, a CNN with only news titles and an LSTM with only stock data. The CNNs using only stock data and only news titles have mean squared errors (MSEs) of 28.3935 and 0.1814, respectively, and the LSTM using only stock data has an MSE of 0.0763. The STACN achieves an MSE of 0.0304, outperforming the CNNs and LSTMs on stock regression tasks.
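To make the described architecture concrete, the sketch below shows one plausible way to fuse text and numerical branches as the abstract outlines: a 1-D CNN over news-title embeddings, an LSTM over the daily price features, and a softmax attention weighting over the two modality summaries before a regression head. This is a minimal illustration under assumed layer sizes and names (e.g. TextNumericFusion), not the authors' exact STACN implementation.

```python
# Illustrative sketch (not the published STACN): CNN text branch + LSTM
# numerical branch, fused with learned attention for price regression.
# All dimensions and hyperparameters are assumptions for demonstration.
import torch
import torch.nn as nn


class TextNumericFusion(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=64, num_features=5, hidden=64):
        super().__init__()
        # Text branch: token embedding + 1-D convolution over positions
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, hidden, kernel_size=3, padding=1)
        # Numerical branch: LSTM over the daily stock-feature sequence
        self.lstm = nn.LSTM(num_features, hidden, batch_first=True)
        # Attention scores over the two modality summaries
        self.attn = nn.Linear(hidden, 1)
        self.head = nn.Linear(hidden, 1)  # regression output, e.g. next-day price

    def forward(self, tokens, prices):
        # tokens: (batch, seq_len) token ids; prices: (batch, days, num_features)
        t = self.embed(tokens).transpose(1, 2)          # (batch, embed_dim, seq_len)
        t = torch.relu(self.conv(t)).max(dim=2).values  # global max pool -> (batch, hidden)
        _, (h, _) = self.lstm(prices)
        n = h[-1]                                       # final LSTM hidden state -> (batch, hidden)
        feats = torch.stack([t, n], dim=1)              # (batch, 2, hidden)
        weights = torch.softmax(self.attn(feats), dim=1)  # attention over modalities
        fused = (weights * feats).sum(dim=1)            # weighted sum -> (batch, hidden)
        return self.head(fused).squeeze(-1)


# Quick shape check with random inputs
model = TextNumericFusion()
pred = model(torch.randint(0, 10000, (8, 20)), torch.randn(8, 30, 5))
print(pred.shape)  # torch.Size([8])
```

A regression loss such as nn.MSELoss on the predicted prices would match the MSE figures reported in the abstract.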