Uncertainty estimation affects predictor selection and its calibration improves materials optimization
Yuan Jiang, Jinshan Li, Tinghuan Yuan, Jun Wang, Bin Tang, Xinping Mao, Gang Li, Ruihao Yuan
Journal of Materials Informatics, 2026, Vol. 6, Issue 1: 4
Uncertainty is crucial when the data available for building a predictor are insufficient, a situation ubiquitous in machine-learning-driven materials studies. However, the impact of uncertainty estimation on predictor selection and materials optimization remains incompletely understood. Here, we demonstrate that in active learning, uncertainty estimation significantly influences predictor selection, and that calibrating the uncertainty estimates can improve the optimization. The idea is validated on three alloy datasets (Ni-based, Fe-based, and Ti-based) using three commonly used algorithms, support vector regression (SVR), neural networks (NN), and extreme gradient boosting (XGBoost), which yield comparable predictive accuracy. XGBoost is shown to provide more reliable uncertainty estimates than SVR and NN. When the directly estimated uncertainties of the three predictors with similar accuracy are used, the optimization outcomes differ markedly, indicating that uncertainty estimation plays a role in predictor selection. The uncertainty estimates are then calibrated to improve their reliability, and the effect on optimization is compared with the uncalibrated case. Among the nine cases considered (three models and three datasets), eight show improved optimization when calibrated uncertainty estimates are used. This work suggests that uncertainty estimation and its calibration deserve greater attention in active-learning-driven materials discovery.
Keywords: Uncertainty estimation, uncertainty calibration, active learning / optimization
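The workflow described in the abstract (an uncertainty-aware predictor driving an active-learning loop, with the uncertainty optionally calibrated before it enters the acquisition function) can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the paper's implementation: it assumes a bootstrap ensemble of XGBoost regressors for uncertainty estimation, a simple variance-scaling calibration fitted on held-out data, and an expected-improvement acquisition; the dataset is a synthetic stand-in for an alloy composition-property table, and all function names are hypothetical.

```python
import numpy as np
from scipy.stats import norm
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split

def fit_bootstrap_ensemble(X, y, n_models=20, seed=0):
    """Train XGBoost regressors on bootstrap resamples of the training data."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))
        m = XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
        m.fit(X[idx], y[idx])
        models.append(m)
    return models

def ensemble_predict(models, X):
    """Predictive mean and standard deviation across the ensemble."""
    preds = np.stack([m.predict(X) for m in models])
    return preds.mean(axis=0), preds.std(axis=0)

def variance_scaling_factor(models, X_cal, y_cal):
    """Scalar s so that s*sigma makes standardized residuals unit-variance (simple recalibration)."""
    mu, sigma = ensemble_predict(models, X_cal)
    z = (y_cal - mu) / np.clip(sigma, 1e-8, None)
    return np.sqrt(np.mean(z ** 2))

def expected_improvement(mu, sigma, y_best):
    """Expected improvement for maximization, given predictive mean and std."""
    sigma = np.clip(sigma, 1e-8, None)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

# Synthetic stand-in for an alloy dataset (features -> target property).
rng = np.random.default_rng(0)
X = rng.random((120, 5))
y = X @ np.array([2.0, -1.0, 0.5, 0.0, 1.5]) + 0.1 * rng.standard_normal(120)

X_train, X_cal, y_train, y_cal = train_test_split(X, y, test_size=0.3, random_state=0)
models = fit_bootstrap_ensemble(X_train, y_train)
s = variance_scaling_factor(models, X_cal, y_cal)

X_pool = rng.random((50, 5))  # unexplored candidate compositions
mu, sigma = ensemble_predict(models, X_pool)
ei_raw = expected_improvement(mu, sigma, y_train.max())
ei_cal = expected_improvement(mu, s * sigma, y_train.max())
print("next candidate (raw sigma):       ", int(np.argmax(ei_raw)))
print("next candidate (calibrated sigma):", int(np.argmax(ei_cal)))
```

In this sketch, a scaling factor s greater than one indicates the raw ensemble is overconfident (its predicted spread is too small), while s less than one indicates underconfidence; rescaling the uncertainty before computing the acquisition can therefore change which candidate is selected next, which is the mechanism the abstract attributes to calibration improving the optimization.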