Investigation of Laser Ablation Quality Based on Data Science and Machine Learning XGBoost Classifier


Bibliographic Details
Main Authors: Chien-Chung Tsai, Tung-Hon Yiu
Format: Article
Language: English
Published: MDPI AG 2023-12-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/14/1/326
Description
Summary: This work proposes a matching data science approach to the laser ablation quality, r_eb, of Si₃N₄ film, based on supervised machine learning classifiers in the CMOS-MEMS process. The study demonstrates that there exists an energy threshold, E_th, for laser ablation: once the laser energy surpasses this threshold, increasing the interval time does not contribute significantly to the recovery of pulse laser energy, so the r_eb enhancement is limited. When the energy is greater than 0.258 mJ, there exists, for each energy level, a critical interval time at which the r_eb value is relatively low. In addition, the variation of r_eb, Δr_eb, is independent of the interval time at the invariant point of energy between 0.32 mJ and 0.36 mJ. Energy and interval time exhibit Pearson correlations of 0.82 and 0.53 with r_eb, respectively. To maintain Δr_eb below 0.15, green laser ablation of Si₃N₄ at operating energies of 0.258–0.378 mJ can adopt an interval time equal to the initial baseline multiplied by 1/∜2; for operating energies of 0.288–0.378 mJ, Δr_eb can be kept below 0.1. With the forced partition methods, namely the k-means method and the percentile method, the XGBoost (v2.0.3) classifier maintains competitive accuracy across test sizes of 0.20–0.40, outperforming the machine learning algorithms Random Forest and Logistic Regression, with the highest accuracy of 0.78 at a test size of 0.20.
ISSN: 2076-3417
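
The classification workflow described in the summary, forced partitioning of the continuous quality metric r_eb into discrete classes via k-means or percentile binning, followed by an XGBoost classifier compared against Random Forest and Logistic Regression over test sizes of 0.20–0.40, can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' code: the file name ablation_quality.csv, the column names energy, interval_time, and r_eb, the two-class split, and all model hyperparameters are assumptions.

    # Minimal sketch of the forced-partition + classifier comparison described in
    # the abstract. File path, column names, and number of classes are assumptions.
    import pandas as pd
    from sklearn.cluster import KMeans
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier

    df = pd.read_csv("ablation_quality.csv")  # hypothetical columns: energy, interval_time, r_eb
    X = df[["energy", "interval_time"]].values

    def kmeans_labels(r_eb, n_classes=2):
        """Forced partition: cluster r_eb values into n_classes quality bins (k-means)."""
        km = KMeans(n_clusters=n_classes, n_init=10, random_state=0)
        return km.fit_predict(r_eb.reshape(-1, 1))

    def percentile_labels(r_eb, n_classes=2):
        """Forced partition: bin r_eb into equal-frequency classes (percentile method)."""
        return pd.qcut(r_eb, q=n_classes, labels=False)

    models = {
        "XGBoost": XGBClassifier(n_estimators=100, eval_metric="logloss"),
        "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
        "Logistic Regression": LogisticRegression(max_iter=1000),
    }

    for partition_name, labeler in [("k-means", kmeans_labels), ("percentile", percentile_labels)]:
        y = labeler(df["r_eb"].to_numpy())
        for test_size in (0.20, 0.25, 0.30, 0.35, 0.40):
            X_tr, X_te, y_tr, y_te = train_test_split(
                X, y, test_size=test_size, random_state=0, stratify=y)
            for model_name, model in models.items():
                acc = accuracy_score(y_te, model.fit(X_tr, y_tr).predict(X_te))
                print(f"{partition_name:10s} test_size={test_size:.2f} {model_name:20s} acc={acc:.2f}")

The loop over test sizes mirrors the 0.20–0.40 range reported in the abstract; the accuracies printed here would depend entirely on the (assumed) data and are not the paper's results.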