Physics-Based Active Learning for Design Space Exploration and Surrogate Construction for Multiparametric Optimization
Sergio Torregrosa, Victor Champaney, Amine Ammar, Vincent Herbert, Francisco Chinesta
The sampling of training data is a bottleneck in the development of artificial intelligence (AI) models, owing to the processing of huge amounts of data or to the difficulty of accessing data in industrial practice. Active learning (AL) approaches are useful in such a context, since they maximize the performance of the trained model while minimizing the number of training samples. These smart sampling methodologies iteratively select the points that should be labeled and added to the training set, based on their informativeness and pertinence; query rules are defined to judge the relevance of a data instance. In this paper, we propose an AL methodology based on a physics-based query rule. Given industrial objectives for the physical process in which the AI model is involved, the physics-based AL approach iteratively converges to the data instances fulfilling those objectives while sampling training points. The trained surrogate model is therefore accurate in the region containing the data instances of potential industrial interest, and coarse everywhere else, where the data instances are of no interest in the industrial context studied.
Active learning (AL) / Artificial intelligence (AI) / Optimization / Physics-based
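The loop described in the abstract can be illustrated with a minimal pool-based sketch. Everything here is an illustrative stand-in, not the paper's actual method: `expensive_simulation` plays the role of the costly physical solver, a low-degree polynomial fit plays the role of the surrogate, and `objective` is an assumed industrial target (predicted output close to a target value). The physics-based query rule then samples where the current surrogate predicts the objective is best fulfilled, so surrogate accuracy concentrates in the region of industrial interest.

```python
import numpy as np

def expensive_simulation(x):
    """Stand-in for the costly physical solver being surrogated."""
    return np.sin(6 * x) + x

def objective(y_pred):
    """Assumed industrial objective: predicted output close to a target of 1.2."""
    return (y_pred - 1.2) ** 2

rng = np.random.default_rng(0)
candidates = np.linspace(0.0, 1.0, 201)   # discretized design space (the pool)
X = list(rng.uniform(0.0, 1.0, 3))        # small initial design of experiments
y = [expensive_simulation(x) for x in X]

for _ in range(10):
    # Fit a coarse polynomial surrogate on the labeled points so far.
    coeffs = np.polyfit(X, y, deg=min(3, len(X) - 1))
    surrogate = np.poly1d(coeffs)

    # Physics-based query rule: score every candidate by how well the
    # surrogate's prediction fulfills the industrial objective there.
    scores = objective(surrogate(candidates))
    for x_seen in X:                      # never re-query a labeled point
        scores[np.isclose(candidates, x_seen)] = np.inf

    # Label the most promising candidate and enrich the training set.
    x_new = candidates[np.argmin(scores)]
    X.append(x_new)
    y.append(expensive_simulation(x_new))
```

Replacing the polynomial fit with the sparse PGD regressions cited below, and the scalar objective with the actual multiparametric industrial criteria, recovers the structure of the proposed methodology.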