Unveiling the relationship between Fabry-Perot laser structures and optical field distribution via symbolic regression

Wenqiang Li, Min Wu, Weijun Li, Meilan Hao, Lina Yu

Optoelectronics Letters, 2025, 21(3): 149-154. DOI: 10.1007/s11801-025-4064-2

Abstract

In recent years, machine learning (ML) techniques have proven effective in accelerating the development of optoelectronic devices. However, as “black box” models, they offer limited theoretical interpretability. In this work, we leverage the symbolic regression (SR) technique to discover the explicit symbolic relationship between the structure of a Fabry-Perot (FP) laser and its optical field distribution, which greatly improves model transparency compared with ML. We demonstrate that the expressions discovered through SR exhibit lower errors on the test set than the ML models, suggesting that they have better fitting and generalization capabilities.
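As an illustration of the SR-versus-ML comparison described above, the sketch below fits a symbolic regressor and a random-forest baseline to tabular structure-to-field data and compares their test errors. This is a minimal sketch under stated assumptions: the gplearn backend, the synthetic data, and all variable names are illustrative and do not reproduce the paper's own SR pipeline or dataset.

```python
# Minimal sketch of an SR-vs-ML comparison, assuming a gplearn-based
# symbolic regressor and a scikit-learn random forest baseline.
# The synthetic data stands in for (FP laser structure -> optical field)
# samples; it is NOT the paper's dataset or method.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from gplearn.genetic import SymbolicRegressor

# Hypothetical inputs: three normalized structure parameters per sample
# (e.g., cavity length, facet reflectivity, active-layer thickness);
# target: a scalar descriptor of the optical field distribution.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))
y = np.sin(2 * np.pi * X[:, 0]) * X[:, 1] + 0.1 * X[:, 2]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Black-box baseline: random forest regression.
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)

# Symbolic regression: searches over closed-form expressions built from a
# fixed set of primitive functions.
sr = SymbolicRegressor(
    population_size=2000,
    generations=20,
    function_set=("add", "sub", "mul", "div", "sin", "cos"),
    parsimony_coefficient=0.001,  # penalize overly long expressions
    random_state=0,
)
sr.fit(X_train, y_train)

print("RF test MSE:", mean_squared_error(y_test, rf.predict(X_test)))
print("SR test MSE:", mean_squared_error(y_test, sr.predict(X_test)))
print("Recovered expression:", sr._program)  # explicit, human-readable formula
```

Unlike the random-forest predictions, the recovered expression can be inspected and simplified analytically, which is the transparency advantage the abstract refers to.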

Cite this article

Wenqiang Li, Min Wu, Weijun Li, Meilan Hao, Lina Yu. Unveiling the relationship between Fabry-Perot laser structures and optical field distribution via symbolic regression. Optoelectronics Letters, 2025, 21(3): 149-154. DOI: 10.1007/s11801-025-4064-2

RIGHTS & PERMISSIONS

Tianjin University of Technology
