Chaos is the seemingly irregular, random-like motion that arises in deterministic natural systems. More and more time series with chaotic characteristics are being obtained from real systems, such as atmospheric circulation, temperature, rainfall, sunspots, and the flow of the Yellow River, and the prediction of chaotic time series has become a research hotspot in recent years. Because neural networks possess a strong nonlinear approximation capability, they give good prediction performance in chaotic time series modeling. The extreme learning machine (ELM) is a kind of neural network that is widely used because of its simple structure, high learning efficiency, and globally optimal solution. The ELM initializes the input weights randomly and adjusts only the output weights during training so that the globally optimal solution can be obtained; it therefore converges quickly and avoids the vanishing-gradient problem. Owing to these advantages, improved ELM algorithms have developed rapidly in recent years. However, the traditional training methods of the ELM have poor robustness and are easily affected by noise and outliers, and in practical applications the time series are often contaminated by exactly such noise and outliers. It is therefore important to improve the robustness of the forecasting model and to reduce the influence of noise and abnormal points in order to obtain better prediction accuracy. In this paper, a robust extreme learning machine is proposed within a Bayesian framework to handle outliers in the training data set. First, the input samples are mapped onto a high-dimensional feature space and the output weights of the ELM are taken as the parameters to be estimated; the proposed model then adopts the more robust Gaussian mixture distribution as the likelihood function of the model output. Because the marginal likelihood of the model output is analytically intractable under the Gaussian mixture distribution, a variational procedure is introduced to carry out the parameter estimation. The proposed model is compared with other prediction models under different noise levels and different numbers of outliers. The experimental results on the Lorenz, Rossler, and Sunspot-Runoff in the Yellow River time series with outliers and noise demonstrate that the proposed robust extreme learning machine achieves better prediction accuracy. It not only has a strong nonlinear approximation capability but also learns the model parameters automatically and is highly robust. Finally, the time complexities of the different models are compared and the convergence of the proposed model is analyzed at the end of the paper.
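As a rough illustration of the idea described above (a sketch only, not the paper's exact algorithm), the snippet below trains an ELM whose random hidden layer is fixed and whose output weights are re-estimated under a two-component Gaussian mixture noise model, p(e) = pi N(e; 0, sigma1^2) + (1 - pi) N(e; 0, sigma2^2), in which the broad second component absorbs outliers. The paper performs this estimation with a variational Bayesian procedure; for brevity the sketch substitutes a simple EM-style reweighted least-squares loop, and all names (robust_elm_fit, n_hidden, em_iters, and so on) are illustrative.

```python
# A minimal sketch, NOT the paper's variational Bayesian algorithm: an ELM with a
# fixed random hidden layer whose output weights are estimated under a
# two-component Gaussian mixture noise model (narrow "inlier" + broad "outlier"
# component) by a simple EM-style reweighted least-squares loop.
import numpy as np

def elm_features(X, W, b):
    """Random hidden-layer mapping H = g(X W + b) with a sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def robust_elm_fit(X, y, n_hidden=50, em_iters=30, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((d, n_hidden))       # random input weights, never updated
    b = rng.standard_normal(n_hidden)            # random biases, never updated
    H = elm_features(X, W, b)

    beta = np.linalg.lstsq(H, y, rcond=None)[0]  # ordinary ELM solution as the start
    pi, var_in, var_out = 0.9, 1.0, 100.0        # mixture weight and the two noise variances

    for _ in range(em_iters):
        r2 = (y - H @ beta) ** 2
        # E-step: responsibility of the inlier component for every training sample
        p_in = pi * np.exp(-0.5 * r2 / var_in) / np.sqrt(var_in)
        p_out = (1.0 - pi) * np.exp(-0.5 * r2 / var_out) / np.sqrt(var_out)
        gamma = p_in / (p_in + p_out + 1e-12)
        # M-step: weighted (ridge-regularised) least squares for the output weights,
        # then update the mixture weight and the two variances
        Hw = H * gamma[:, None]
        beta = np.linalg.solve(H.T @ Hw + 1e-6 * np.eye(n_hidden), Hw.T @ y)
        pi = gamma.mean()
        var_in = max((gamma * r2).sum() / max(gamma.sum(), 1e-12), 1e-8)
        var_out = max(((1.0 - gamma) * r2).sum() / max((1.0 - gamma).sum(), 1e-12), 1e-8)
    return W, b, beta

def robust_elm_predict(X, W, b, beta):
    return elm_features(X, W, b) @ beta
```

For one-step-ahead chaotic time series prediction, X would typically be built from delay-embedded past values of the series and y from the corresponding next value; samples that sit far from the fitted curve receive a low inlier responsibility and thus contribute little to the output weights.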
Keywords:
- extreme learning machine
- robust
- chaotic time series
- prediction