
High-dimensional Bayesian optimization for metamaterial design
Zhichao Tian1,2, Yang Yang3, Sui Zhou2, Tian Zhou2, Ke Deng4, Chunlin Ji2, Yejun He1, Jun S. Liu5
Materials Genome Engineering Advances, 2024, Vol. 2, Issue 4: e79.
Metamaterial design, encompassing both microstructure topology selection and geometric parameter optimization, constitutes a high-dimensional optimization problem whose design evaluations are computationally expensive and time-consuming. Bayesian optimization (BO) is a promising approach to the black-box optimization problems arising in materials design, and this work presents several techniques that adapt BO to the challenges of metamaterial design. First, variational autoencoders (VAEs) are employed for efficient dimensionality reduction, mapping complex, high-dimensional metamaterial microstructures into a compact latent space. Second, mutual information maximization is incorporated into the VAE to enhance the quality of the learned latent space, ensuring that the features most relevant for optimization are retained. Third, trust region-based Bayesian optimization (TuRBO) dynamically adjusts local search regions, ensuring stability and convergence in high-dimensional spaces. The proposed techniques integrate naturally with the conventional Gaussian process (GP)-based BO framework. We applied the proposed method to the design of electromagnetic metamaterial microstructures. Experimental results show that the method recovers the ground-truth topology types and their geometric parameters with high probability, yielding close agreement with the design target. Moreover, the approach is substantially more time-efficient than traditional design methods.
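As a concrete illustration of the first two ingredients (a VAE whose training objective also rewards mutual information between the input and the latent code), the sketch below trains a small VAE with an InfoNCE-style contrastive critic as a lower bound on I(x; z). The layer sizes, the critic architecture, and the weight `beta` are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): a VAE whose loss adds a mutual-information
# term between input x and latent z, estimated with an InfoNCE-style critic.
# Layer sizes, the critic, and `beta` are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MIRegularizedVAE(nn.Module):
    def __init__(self, x_dim=256, z_dim=8, hidden=128):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.mu, self.logvar = nn.Linear(hidden, z_dim), nn.Linear(hidden, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, x_dim))
        # Critic scores (x, z) pairs; used for the InfoNCE lower bound on I(x; z).
        self.critic = nn.Sequential(nn.Linear(x_dim + z_dim, hidden), nn.ReLU(),
                                    nn.Linear(hidden, 1))

    def encode(self, x):
        h = self.enc(x)
        return self.mu(h), self.logvar(h)

    def forward(self, x, beta=0.1):
        mu, logvar = self.encode(x)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        recon = self.dec(z)

        recon_loss = F.mse_loss(recon, x, reduction="sum") / x.size(0)
        kl = -0.5 * torch.mean(torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1))

        # InfoNCE: positive pairs on the diagonal, negatives from mismatched (x, z).
        n = x.size(0)
        scores = self.critic(torch.cat([x.repeat_interleave(n, 0),
                                        z.repeat(n, 1)], dim=1)).view(n, n)
        mi_lower_bound = torch.diagonal(F.log_softmax(scores, dim=1)).mean()

        # Maximizing mutual information corresponds to subtracting its lower bound.
        return recon_loss + kl - beta * mi_lower_bound
```

In a pipeline of this kind, the trained encoder maps candidate microstructures to compact latent vectors, and the GP surrogate and acquisition function then operate in that latent space rather than on the raw high-dimensional geometry.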
high-dimensional Bayesian optimization / metamaterial design / mutual information maximization / surrogate modeling / trust region Bayesian optimization / variational autoencoders
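The third ingredient, trust region-based BO, can be sketched as follows in the learned latent space, in the spirit of TuRBO (Eriksson et al., 2019). The `objective` function stands for an expensive evaluation (e.g., decoding a latent vector to a microstructure and scoring its simulated response against the design target) and is a hypothetical placeholder; the GP surrogate, Thompson-style candidate selection, and the expand/shrink constants are simplified illustrations, not the paper's settings.

```python
# Minimal sketch of trust-region Bayesian optimization in a learned latent space.
# `objective`, `z_dim`, and the trust-region constants are assumptions for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def turbo_style_search(objective, z_dim=8, n_init=10, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    Z = rng.uniform(-1.0, 1.0, size=(n_init, z_dim))   # initial latent designs
    y = np.array([objective(z) for z in Z])

    length, success, failure = 0.8, 0, 0                # trust-region state
    for _ in range(n_iter):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(Z, y)

        center = Z[np.argmin(y)]                        # best point so far (minimization)
        lo = np.clip(center - length / 2, -1.0, 1.0)
        hi = np.clip(center + length / 2, -1.0, 1.0)
        cand = rng.uniform(lo, hi, size=(512, z_dim))   # candidates inside the trust region

        # Thompson-like selection: pick the candidate with the lowest posterior sample.
        mu, sd = gp.predict(cand, return_std=True)
        z_next = cand[np.argmin(rng.normal(mu, sd))]
        y_next = objective(z_next)

        # Expand on repeated success, shrink on repeated failure (simplified rule).
        if y_next < y.min():
            success, failure = success + 1, 0
        else:
            success, failure = 0, failure + 1
        if success >= 3:
            length, success = min(2 * length, 1.6), 0
        elif failure >= 5:
            length, failure = length / 2, 0

        Z, y = np.vstack([Z, z_next]), np.append(y, y_next)
    return Z[np.argmin(y)], y.min()
```

Restricting candidate generation to a box around the incumbent keeps the GP surrogate accurate where it matters, while the expand/shrink rule lets the search recover global reach when local progress stalls.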