Abstract
Mathematical modelling is fundamental to understanding real-world phenomena. Despite the inherent complexity of designing such models, numerical approaches and, more recently, machine learning techniques have emerged as powerful tools in this area. This work proposes integrating the finite element method (FEM) into forecasting and introduces parallel techniques for regression problems, with a specific focus on the use of Matérn kernels on local mesh support. This approach generalises modelling based on radial basis function kernels and offers more flexibility to control the smoothness of the modelled functions. An exhaustive study explores the impact of diverse norms and Matérn kernel variations on model performance, and aims to improve the computational efficiency of the model fitting and prediction processes. Furthermore, a heuristic framework is introduced to derive optimal complexity parameters for each Matérn-based FEM kernel. The proposed parallel approaches use dynamic strategies, which significantly reduce the computational time of the algorithms compared with other methods and parallel computing techniques presented in recent years. The proposed methodology is assessed in the context of bias correction of temperature forecasts made by the Local Data Assimilation and Prediction System (LDAPS) model. A comprehensive comparative analysis, including machine learning algorithms, provides significant insights into the training process, norm selection, and kernel choice, and shows that Matérn-based methods are a choice worth considering for regression problems.
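The abstract centres on Matérn kernels and their smoothness control. As a general illustration only (not the authors' FEM-based implementation), the closed forms of the Matérn family for the common smoothness values ν ∈ {1/2, 3/2, 5/2} can be sketched as:

```python
import numpy as np

def matern(r, length_scale=1.0, nu=1.5):
    """Matérn covariance for distances r >= 0 (closed forms for common nu)."""
    s = np.asarray(r, dtype=float) / length_scale
    if nu == 0.5:
        # Exponential kernel: continuous but not differentiable sample paths.
        return np.exp(-s)
    if nu == 1.5:
        # Once-differentiable sample paths.
        t = np.sqrt(3.0) * s
        return (1.0 + t) * np.exp(-t)
    if nu == 2.5:
        # Twice-differentiable sample paths.
        t = np.sqrt(5.0) * s
        return (1.0 + t + t**2 / 3.0) * np.exp(-t)
    raise ValueError("closed form implemented only for nu in {0.5, 1.5, 2.5}")
```

The parameter ν governs the differentiability of the modelled function, which is the flexibility over radial basis function kernels the abstract refers to: larger ν gives smoother fits, and ν → ∞ recovers the Gaussian (RBF) kernel.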
Keywords
Numerical modelling / Parallel programming / Matérn kernels / Machine learning / Information and Computing Sciences / Artificial Intelligence and Image Processing / Computation Theory and Mathematics / Mathematical Sciences / Numerical and Computational Mathematics / Statistics
Cite this article
Violeta Migallón, Héctor Penadés, José Penadés.
Design and Comparison of Parallel Dynamic Matérn Kernel-Based Regression Models and Machine Learning Approaches: Application to Bias Correction in Numerical Weather Prediction.
Communications on Applied Mathematics and Computation, 1–42. https://doi.org/10.1007/s42967-025-00490-6
Funding
Ministerio de Ciencia e Innovación (PID2021-123627OB-C55)
ValgrAI - Valencian Graduate School and Research Network for Artificial Intelligence and Generalitat Valenciana
Universidad de Alicante
Rights & Permissions
The Author(s)