Machine learning efficiently corrects LIBS spectrum variation due to change of laser fluence

2020 
This work demonstrates the efficiency of machine learning in correcting spectral intensity variations in laser-induced breakdown spectroscopy (LIBS) caused by changes in the laser pulse energy; such changes can occur over a wide range, from 7.9 to 71.1 mJ in our experiment. The developed multivariate correction model enabled precise determination of the concentration of a minor element (magnesium, for instance) in the samples (aluminum alloys in this work), with a precision of 6.3% (relative standard deviation, RSD) using LIBS spectra affected by the laser pulse energy change. A comparison with the classical univariate corrections based on laser pulse energy, total spectral intensity, ablation crater volume, and plasma temperature further highlights the significance of the developed method.
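The multivariate-versus-univariate contrast described above can be illustrated with a toy sketch. This is synthetic data with ridge regression standing in for the paper's machine-learning model; the channel patterns, noise level, and forward model are all assumptions for illustration, not the authors' setup (only the 7.9 to 71.1 mJ energy range is taken from the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 40                                  # synthetic data: n shots, k spectral channels

conc = rng.uniform(0.1, 1.0, n)                 # hypothetical Mg concentration (arb. units)
energy = rng.uniform(7.9, 71.1, n)              # pulse energy over the paper's range, mJ
p_line = rng.uniform(0.5, 1.5, k)               # analyte-line channel pattern (assumed)
p_bg = rng.uniform(0.5, 1.5, k)                 # energy-driven background pattern (assumed)

# Simplified forward model: analyte signal plus an energy-dependent perturbation.
spectra = conc[:, None] * p_line + (energy / 40.0)[:, None] * p_bg
spectra += rng.normal(0.0, 0.02, spectra.shape)

# Univariate baseline: predict concentration from total spectral intensity alone.
T = spectra.sum(axis=1)
a, b = np.polyfit(T, conc, 1)
rsd_uni = np.std(a * T + b - conc) / conc.mean() * 100

# Multivariate correction: ridge regression over all channels can learn to cancel
# the energy-dependent component while keeping the analyte signal.
X = np.hstack([spectra, np.ones((n, 1))])
w = np.linalg.solve(X.T @ X + 1e-3 * np.eye(k + 1), X.T @ conc)
rsd_multi = np.std(X @ w - conc) / conc.mean() * 100

print(f"univariate RSD:   {rsd_uni:.1f}%")
print(f"multivariate RSD: {rsd_multi:.1f}%")
```

Because the energy-driven background has a different channel pattern than the analyte line, a model that sees all channels can separate the two contributions, whereas a single summary quantity such as total intensity cannot; this is the intuition behind the paper's comparison, not a reproduction of its results.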