Laser-induced breakdown spectroscopy (LIBS) is a spectroscopic analytical technique with great potential because of its unique advantages for fast, online, and in situ measurement. The quantification results of traditional calibration models based on physical principles are often unsatisfactory, since these models cannot compensate for complicated matrix effects and signal fluctuations. Machine learning can intelligently correlate complex LIBS spectral data with analysis results. It can perform data preprocessing such as spectral selection, variable reconstruction, denoising, and interference removal, and it can suppress the influence of signal fluctuations, matrix effects, and self-absorption effects in the modeling. Machine learning has therefore become an essential choice for improving the performance of qualitative and quantitative LIBS analysis. Future research needs to address the remaining problems of machine-learning-based LIBS analysis, such as restrictions on training data, the disconnect between physical principles and algorithms, the limited generalization ability of models, and the processing of massive data. For more details, please see the article entitled “Machine learning in laser-induced breakdown spectroscopy: A review” by Zhongqi Hao, et al., Front. Phys. 19(6), 62501 (2024). [Photo credits: Zhongqi Hao at Nanchang Hangkong University & Zhe Wang at Tsinghua University]
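As a concrete illustration of the general idea (not taken from the reviewed article), the minimal sketch below shows one common way machine learning is used for LIBS quantification: a partial least squares (PLS) regression model mapping full spectra to element concentrations, with simple channel-wise standardization and cross-validated evaluation. The spectra, the analyte emission line at 400 nm, and all parameter choices are synthetic assumptions made purely for demonstration.

```python
# Hypothetical sketch of ML-based LIBS quantification: PLS regression on
# synthetic spectra (not the method of the reviewed article).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for LIBS spectra: 200 samples x 1024 wavelength channels.
# Each spectrum is a varying baseline plus one emission-line peak whose height
# scales with an assumed analyte concentration, plus noise mimicking
# shot-to-shot signal fluctuation.
n_samples, n_channels = 200, 1024
wavelengths = np.linspace(200.0, 900.0, n_channels)           # nm
concentration = rng.uniform(0.1, 10.0, n_samples)             # wt.%, assumed
line_center, line_width = 400.0, 0.8                          # assumed analyte line
peak = np.exp(-0.5 * ((wavelengths - line_center) / line_width) ** 2)
spectra = (
    concentration[:, None] * peak[None, :]                    # analyte emission
    + 0.5 * rng.random((n_samples, 1))                        # fluctuating baseline
    + 0.05 * rng.standard_normal((n_samples, n_channels))     # noise
)

# Pipeline: channel-wise standardization followed by PLS regression,
# a widely used multivariate calibration method for spectral data.
model = make_pipeline(StandardScaler(), PLSRegression(n_components=5))

# 5-fold cross-validated R^2 as a quick measure of quantification performance.
scores = cross_val_score(model, spectra, concentration, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```

In practice, the same pipeline structure is applied to measured spectra and reference concentrations, and the preprocessing and regression steps are swapped for whichever methods suit the data.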