
Prediction of chaotic time series using hybrid neural network and attention mechanism

Huang Wei-Jian, Li Yong-Tao, Huang Yuan

  • Chaotic time series forecasting is widely used in many domains, and accurate prediction of chaotic time series plays a critical role in many public events. Recently, various deep learning algorithms have been used to forecast chaotic time series and have achieved good prediction performance. To improve the prediction accuracy, a prediction model (Att-CNN-LSTM) based on a hybrid neural network and an attention mechanism is proposed. A convolutional neural network (CNN) and a long short-term memory network (LSTM) form the hybrid neural network, and an attention model with a softmax activation function is designed to extract the key features. First, phase space reconstruction and data normalization are performed on the chaotic time series; the CNN then extracts the spatial features of the reconstructed phase space, these features are combined with the original chaotic time series, and the LSTM extracts the temporal features from the combined vectors. The attention mechanism then captures the key spatial-temporal features, from which the final prediction is computed. To verify the prediction performance of the proposed hybrid model, it is used to predict the Logistic, Lorenz and sunspot chaotic time series. Four error criteria and the model running time are used to evaluate the predictive models. Compared with the hybrid CNN-LSTM model, the single CNN and LSTM network models and the least squares support vector machine (LSSVM), the proposed hybrid model achieves higher prediction accuracy. (Illustrative code sketches of the prediction pipeline and of the four error criteria are provided below.)
      Corresponding author: Li Yong-Tao, lyotard@163.com
    • Funds: Project supported by the Natural Science Foundation of Hebei Province, China (Grant No. F2015402077) and the Scientific Research Foundation of the Higher Education Institutions of Hebei Province, China (Grant No. QN2018073)
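
The article excerpt gives no code, so the following is a minimal, illustrative sketch of the pipeline described in the abstract, assuming a TensorFlow/Keras implementation. The function names (phase_space_reconstruct, min_max_normalize, build_att_cnn_lstm), the embedding parameters (m = 6, tau = 1), the layer sizes (32 Conv1D filters, kernel size 3, 64 LSTM units) and the training settings are assumptions for illustration, not the configuration reported by the authors; the attention block is a simple softmax weighting over the LSTM time steps followed by a weighted sum.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

def min_max_normalize(series):
    """Scale the series to [0, 1] before phase space reconstruction."""
    s = np.asarray(series, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def phase_space_reconstruct(series, m, tau):
    """Delay embedding: row i is [x(i), x(i+tau), ..., x(i+(m-1)tau)];
    the prediction target is the next value of the series."""
    n = len(series) - (m - 1) * tau - 1
    X = np.array([series[i : i + m * tau : tau] for i in range(n)])
    y = np.asarray(series[(m - 1) * tau + 1 : (m - 1) * tau + 1 + n])
    return X[..., None], y                       # add a channel axis for Conv1D

def build_att_cnn_lstm(m, n_filters=32, kernel_size=3, n_units=64):
    inputs = tf.keras.Input(shape=(m, 1))
    # CNN: spatial features of the reconstructed phase-space vectors
    c = layers.Conv1D(n_filters, kernel_size, padding="same", activation="relu")(inputs)
    # combine the CNN features with the original (embedded) series
    merged = layers.Concatenate(axis=-1)([inputs, c])
    # LSTM: temporal features of the combined vectors
    h = layers.LSTM(n_units, return_sequences=True)(merged)
    # attention: softmax weights over time steps, then a weighted sum
    score = layers.Dense(1)(h)
    alpha = layers.Softmax(axis=1)(score)
    context = layers.Dot(axes=1)([alpha, h])     # shape (batch, 1, n_units)
    context = layers.Flatten()(context)
    outputs = layers.Dense(1)(context)           # one-step-ahead prediction
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

# Example usage on a Logistic-map series (illustrative parameters only)
x = np.empty(3000)
x[0] = 0.1
for t in range(1, 3000):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])
X, y = phase_space_reconstruct(min_max_normalize(x), m=6, tau=1)
model = build_att_cnn_lstm(m=6)
model.fit(X, y, epochs=20, batch_size=64, validation_split=0.1, verbose=0)
```

The Dot layer computes the attention-weighted sum of the LSTM outputs; applying the same sketch to the Lorenz or sunspot series only changes the data-generation lines and the embedding parameters.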

  • Figure 1.  The process of data preprocessing.

    Figure 2.  The Att-CNN-LSTM model.

    Figure 3.  One-dimensional convolutional network.

    Figure 4.  The structure of the LSTM.

    Figure 5.  The attention model.

    Figure 6.  Logistic series prediction.

    Figure 7.  Relative error of the Logistic series prediction.

    Figure 8.  Lorenz (x) series prediction.

    Figure 9.  Relative error of the Lorenz (x) series prediction.

    Figure 10.  Sunspot series prediction.

    Figure 11.  Relative error of the sunspot series prediction.

    Table 1.  Model error comparison.

    Model          RMSE      MAE       MAPE    RMSPE
    Att-CNN-LSTM   0.003503  0.002935  0.5305  0.6767
    CNN-LSTM       0.006856  0.005444  1.1064  1.7795
    LSTM           0.006169  0.005316  1.1595  1.6887
    CNN            0.004670  0.003849  0.8802  1.4019
    LSSVM          0.009158  0.004307  1.3623  3.8604

    Table 2.  Model running time comparison.

    Model                Att-CNN-LSTM  CNN-LSTM  CNN   LSTM  LSSVM
    Training time /s     312.7         302       59.5  48.8  215.4
    Prediction time /s   0.53          0.49      0.25  0.21  0.47

    Table 3.  Model error comparison.

    Model          RMSE    MAE     MAPE     RMSPE
    Att-CNN-LSTM   0.0679  0.0521  1.2182   2.1102
    CNN-LSTM       0.2445  0.1229  3.8849   14.7893
    LSTM           0.5152  0.3901  13.6767  43.7676
    CNN            0.5356  0.3811  11.0032  33.5251
    LSSVM          0.5101  0.3652  9.3543   35.5644

    Table 4.  Model running time comparison.

    Model                Att-CNN-LSTM  CNN-LSTM  CNN    LSTM  LSSVM
    Training time /s     184.6         193.9     84.28  51.2  1202.4
    Prediction time /s   0.47          0.55      0.20   0.23  0.45

    Table 5.  Model error comparison.

    Model          RMSE     MAE      MAPE     RMSPE
    Att-CNN-LSTM   20.1829  17.1827  32.8167  42.3529
    CNN-LSTM       30.5652  21.4093  67.4343  56.7217
    LSTM           24.9137  18.2815  49.5862  62.6939
    CNN            24.8534  18.4677  65.3480  44.1892
    LSSVM          27.3271  19.4373  43.0987  56.6781

    Table 6.  Model running time comparison.

    Model                Att-CNN-LSTM  CNN-LSTM  CNN   LSTM  LSSVM
    Training time /s     309.2         291.3     76.5  53.3  237.9
    Prediction time /s   0.39          0.43      0.15  0.25  0.59
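
The four error criteria used in Tables 1, 3 and 5 are not defined in this excerpt. The helper below uses their standard textbook definitions, with MAPE and RMSPE expressed in percent; whether the paper applies exactly these conventions (in particular the factor of 100) is an assumption.

```python
import numpy as np

def error_criteria(y_true, y_pred):
    """Standard definitions (assumed, not taken from the paper) of the four
    error criteria: RMSE, MAE, and the percentage errors MAPE and RMSPE."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    mape = 100.0 * np.mean(np.abs(err / y_true))
    rmspe = 100.0 * np.sqrt(np.mean((err / y_true) ** 2))
    return {"RMSE": rmse, "MAE": mae, "MAPE": mape, "RMSPE": rmspe}
```

Note that MAPE and RMSPE divide by the true values, so they are sensitive to targets close to zero even when the absolute errors (RMSE, MAE) are small.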
  • [1]

    王世元, 史春芬, 钱国兵, 王万里 2018 物理学报 67 018401Google Scholar

    Wang S Y, Shi C F, Qian G B, Wang W L 2018 Acta Phys. Sin. 67 018401Google Scholar

    [2]

    梅英, 谭冠政, 刘振焘, 武鹤 2018 物理学报 67 080502Google Scholar

    Mei Y, Tan G Z, Liu Z T, Wu H 2018 Acta Phys. Sin. 67 080502Google Scholar

    [3]

    沈力华, 陈吉红, 曾志刚, 金健 2018 物理学报 67 030501Google Scholar

    Shen L H, Chen J H, Zeng Z G, Jin J 2018 Acta Phys. Sin. 67 030501Google Scholar

    [4]

    Han M, Zhang S, Xu M, Qiu T, Wang N 2019 IEEE Trans. Cybern. 49 1160Google Scholar

    [5]

    Han M, Zhang R, Qiu T, Xu M, Ren W 2019 IEEE Trans. Syst. Man Cybern 49 2144Google Scholar

    [6]

    Safari N, Chung C Y, Price G C D 2018 IEEE Trans. Power Syst. 33 590Google Scholar

    [7]

    熊有成, 赵鸿 2019 中国科学: 物理学 力学 天文学 49 92

    Xiong Y C, Zhao H 2019 Sci. China, Ser. G 49 92

    [8]

    Sangiorgio M, Dercole F 2020 Chaos, Solitons Fractals 139 110045Google Scholar

    [9]

    李世玺, 孙宪坤, 尹玲, 张仕森 2020 导航定位学报 8 65Google Scholar

    Li S X, Sun X K, Yin L, Zhang S S 2020 Journal of Navigation and Positionina 8 65Google Scholar

    [10]

    Boullé N, Dallas V, Nakatsukasa Y, Samaddar D 2019 Physica D 403 132261Google Scholar

    [11]

    唐舟进, 任峰, 彭涛, 王文博 2014 物理学报 63 050505Google Scholar

    Tang Z J, Ren F, Peng T, Wang W B 2014 Acta Phys. Sin. 63 050505Google Scholar

    [12]

    田中大, 高宪文, 石彤 2014 物理学报 63 160508Google Scholar

    Tian Z D, Gao X W, Shi T 2014 Acta Phys. Sin. 63 160508Google Scholar

    [13]

    王新迎, 韩敏 2015 物理学报 64 070504Google Scholar

    Wang Y X, Han M 2015 Acta Phys. Sin. 64 070504Google Scholar

    [14]

    吕金虎 2002 混沌时间序列分析及其应用 (武汉: 武汉大学出版社) 第57−60页

    Lu J H 2002 Chaotic Time Series Analysis and Application (Wuhan: Wuhan University Press) pp57−60 (in Chinese)

    [15]

    Packard N, Crutchfield J P, Shaw R 1980 Phys. Rev. Lett. 45 712Google Scholar

    [16]

    Takens F 1981 Dynamical Systems and Turbulence (Berlin: Springer) pp366−381

    [17]

    Martinerie J M, Albano A M, Mees A I, Rapp P E 1992 Phys. Rev. A 45 7058Google Scholar

    [18]

    Liangyue C 1997 Physica D 110 43Google Scholar

    [19]

    Lecun Y, Boser B, Denker J, Henderson D, Howard R, Hubbard W, Jackel L 1989 Neural Comput. 1 541Google Scholar

    [20]

    Goodfellow I, Bengio Y, Courville A 2016 Deep Learning (Cambridge: The MIT Press) p326

    [21]

    Kim Y 2014 arXiv: 1408.5882 [cs.CL]

    [22]

    Pascanu R, Mikolov T, Bengio Y 2013 Proceedings of the 30th International Conference on International Conference on Machine Learning Atlanta, USA, June 16−21 2013, p1310

    [23]

    Hochreiter S, Schmidhuber J 1997 Neural Comput. 9 1735Google Scholar

    [24]

    Chung J, Gulcehre C, Cho K, Bengio Y 2014 arXiv: 1412.3555 [cs.NE]

    [25]

    Mnih V, Heess N, Graves A, Kavukcuoglu K 2014 Proceedings of the 27th International Conference on Neural Information Processing Systems Montreal, Canada, December 8−13 2014, p2204

    [26]

    Yin W, Schütze H, Xiang B, Zhou B 2015 arXiv: 1512.05193 [cs.CL]

    [27]

    Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez A N, Kaiser L, Polosukhin I 2017 Proceedings of the 31st International Conference on Neural Information Processing Systems Long Beach, USA, December 4−9 2017, p6000

Publishing process
  • Received Date:  12 June 2020
  • Accepted Date:  19 July 2020
  • Available Online:  15 December 2020
  • Published Online:  05 January 2021
