[1] ZONG X P, WU Z H, LIU Y. SVR haze prediction model based on genetic algorithm optimization[J]. Journal of Hebei University (Natural Science Edition), 2016, 36(3): 307-311. DOI: 10.3969/j.issn.1000-1565.2016.03.014.
[2] JIANG Y B, XU N W, CHEN T X, et al. Intelligent land use classification and recognition and rain-flood risk simulation based on BP neural network[J]. Journal of Hebei University (Natural Science Edition), 2024, 44(2): 208-215. DOI: 10.3969/j.issn.1000-1565.2024.02.012.
[3] CHEN Y, BAI M, ZHANG Y, et al. Multivariable space-time correction for wind speed in numerical weather prediction (NWP) based on ConvLSTM and the prediction of probability interval[J]. Earth Sci Inform, 2023, 16(3): 1953-1974. DOI: 10.1007/s12145-023-01036-1.
[4] GLAHN H R, LOWRY D A. The use of model output statistics (MOS) in objective weather forecasting[J]. J Appl Meteor, 1972, 11(8): 1203-1211. DOI: 10.1175/1520-0450(1972)011<1203:TUOMOS>2.0.CO;2.
[5] WANG A, XU L, LI Y, et al. Random-forest based adjusting method for wind forecast of WRF model[J]. Comput Geosci, 2021, 155: 104842. DOI: 10.1016/j.cageo.2021.104842.
[6] YE L, DAI B, LI Z, et al. An ensemble method for short-term wind power prediction considering error correction strategy[J]. Appl Energy, 2022, 322: 119475. DOI: 10.1016/j.apenergy.2022.119475.
[7] YANG Y, ZHA K, CHEN Y, et al. Delving into deep imbalanced regression[C]//International Conference on Machine Learning. PMLR, 2021: 11842-11851.
[8] FERNÁNDEZ A, GARCÍA S, HERRERA F, et al. SMOTE for learning from imbalanced data: progress and challenges, marking the 15-year anniversary[J]. Journal of Artificial Intelligence Research, 2018, 61: 863-905. DOI: 10.1613/jair.1.11192.
[9] TORGO L, RIBEIRO R P, PFAHRINGER B, et al. SMOTE for regression[M]//CORREIA L, REIS L P, CASCALHO J. Progress in Artificial Intelligence. Berlin, Heidelberg: Springer, 2013: 378-389. DOI: 10.1007/978-3-642-40669-0_33.
[10] BRANCO P, TORGO L, RIBEIRO R P. SMOGN: a pre-processing approach for imbalanced regression[C]//First International Workshop on Learning with Imbalanced Domains: Theory and Applications. PMLR, 2017: 36-50.
[11] PELÁEZ-RODRÍGUEZ C, PÉREZ-ARACIL J, PRIETO-GODINO L, et al. A fuzzy-based cascade ensemble model for improving extreme wind speeds prediction[J]. J Wind Eng Ind Aerodyn, 2023, 240: 105507. DOI: 10.1016/j.jweia.2023.105507.
[12] KE S W, LIN W C, TSAI C F, et al. Soft estimation by hierarchical classification and regression[J]. Neurocomputing, 2017, 234: 27-37. DOI: 10.1016/j.neucom.2016.12.037.
[13] PELÁEZ-RODRÍGUEZ C, PÉREZ-ARACIL J, FISTER D, et al. A hierarchical classification/regression algorithm for improving extreme wind speed events prediction[J]. Renew Energy, 2022, 201: 157-178. DOI: 10.1016/j.renene.2022.11.042.
[14] PROKHORENKOVA L, GUSEV G, VOROBEV A, et al. CatBoost: unbiased boosting with categorical features[EB/OL]. 2017: arXiv: 1706.09516. http://arxiv.org/abs/1706.09516.
[15] KE G, MENG Q, FINLEY T, et al. LightGBM: a highly efficient gradient boosting decision tree[J]. Advances in Neural Information Processing Systems, 2017, 30. DOI: 10.5555/3294996.3295074.
[16] SIMIU E, HECKERT N A. Extreme wind distribution tails: a "peaks over threshold" approach[J]. Journal of Structural Engineering, 1996, 122(5): 539-547. DOI: 10.1061/(ASCE)0733-9445(1996)122:5(539).
[17] DAVISON A C. Modelling excesses over high thresholds, with an application[M]//OLIVEIRA J T. Statistical Extremes and Applications. Dordrecht: Springer Netherlands, 1984: 461-482. DOI: 10.1007/978-94-017-3069-3_34.
[18] DEL SER J, CASILLAS-PEREZ D, CORNEJO-BUENO L, et al. Randomization-based machine learning in renewable energy prediction problems: critical literature review, new results and perspectives[J]. Applied Soft Computing, 2022, 118: 108526. DOI: 10.1016/j.asoc.2022.108526.
[19] BREIMAN L. Random forests[J]. Mach Learn, 2001, 45(1): 5-32. DOI: 10.1023/A:1010933404324.
[20] AWAD M, KHANNA R. Support vector regression[M]//Efficient Learning Machines. Berkeley: Apress, 2015. DOI: 10.1007/978-1-4302-5990-9_4.
[21] GARDNER M W, DORLING S R. Artificial neural networks (the multilayer perceptron): a review of applications in the atmospheric sciences[J]. Atmos Environ, 1998, 32(14-15): 2627-2636. DOI: 10.1016/S1352-2310(97)00447-0.
[22] LIU Y, DONG H, WANG X, et al. Time series prediction based on temporal convolutional network[C]//2019 IEEE/ACIS 18th International Conference on Computer and Information Science (ICIS). IEEE, 2019: 300-305. DOI: 10.1109/ICIS46139.2019.8940265.
[23] GRAVES A. Long short-term memory[M]//Supervised Sequence Labelling with Recurrent Neural Networks. Berlin, Heidelberg: Springer, 2012: 37-45. DOI: 10.1007/978-3-642-24797-2_4.
[24] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems. Long Beach, CA, 2017, 30: 6000-6010.
[25] ZHOU T, MA Z Q, WEN Q S, et al. FEDformer: frequency enhanced decomposed transformer for long-term series forecasting[EB/OL]. 2022: arXiv: 2201.12740. http://arxiv.org/abs/2201.12740.
[26] NIE Y Q, NGUYEN N H, SINTHONG P, et al. A time series is worth 64 words: long-term forecasting with transformers[EB/OL]. 2022: arXiv: 2211.14730. http://arxiv.org/abs/2211.14730.