Davide D’Andrea et al. / Procedia Structural Integrity 79 (2026) 283–290
held constant at its second percentile, is not directly represented in the plots but influences the response surfaces as a "hidden" parameter. SHapley Additive exPlanations (SHAP) is a methodology for interpreting ML model outputs by attributing feature-importance scores to individual input variables. Since most ML algorithms operate as black boxes and provide limited transparency regarding the influence of each predictor on model behaviour, SHAP offers a principled framework to quantify and visualize these effects. Specifically, it decomposes a model's prediction into additive contributions from each feature, indicating whether a given variable increases or decreases the predicted value relative to the model's expected output. A negative SHAP value indicates that the corresponding configuration of the input features drives the model output below its mean value, whereas a positive SHAP value signifies a contribution toward increasing the output relative to its mean. Figure 6 illustrates the SHAP value distributions for each geometric feature considered in the analysis. Consistent with the findings from the correlation matrix, the notch opening angle emerges as the most influential parameter, as evidenced by the largest spread of its SHAP values, while the dimple radius and its position exhibit comparable levels of importance. The colormap encodes a normalized scale from low to high values for each variable. Owing to their inverse correlation with the target output, high values of the notch opening angle 2α and of r_dimple correspond to negative SHAP values, while low values correspond to positive ones. Conversely, high values of the dimple position parameter contribute positively to the objective function.
Figure 6. SHAP Summary Plot.
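The sign convention described above (SHAP values measured relative to the mean prediction) can be illustrated with a minimal, self-contained sketch. For a linear model with independent features, the exact SHAP value of feature j at sample x reduces to w_j·(x_j − mean(x_j)), and the values sum to the prediction minus the mean prediction. The weights and the three toy columns (standing in for the notch opening angle, dimple radius, and dimple position) are illustrative assumptions, not the paper's data.

```python
def shap_linear(w, X):
    """Exact SHAP values for a linear model f(x) = w . x with
    independent features: phi_j(x) = w_j * (x_j - mean(x_j))."""
    n = len(X)
    means = [sum(row[j] for row in X) / n for j in range(len(w))]
    return [[w[j] * (row[j] - means[j]) for j in range(len(w))]
            for row in X]

# toy data: columns stand in for [opening angle, dimple radius, dimple position]
w = [2.0, -1.0, 0.5]
X = [[1.0, 2.0, 0.0],
     [3.0, 0.0, 4.0],
     [2.0, 1.0, 2.0]]

phi = shap_linear(w, X)

# Additivity: per sample, SHAP values sum to f(x) minus the mean prediction,
# so negative values push the output below the mean, positive ones above it.
preds = [sum(wj * xj for wj, xj in zip(w, row)) for row in X]
mean_pred = sum(preds) / len(preds)
for row_phi, p in zip(phi, preds):
    assert abs(sum(row_phi) - (p - mean_pred)) < 1e-9
```

The additivity check is the defining property SHAP guarantees for any model; tree- or kernel-based explainers approximate the same decomposition for non-linear models.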
The feature importance (FI) can be calculated as the mean of the absolute SHAP values, as reported in equation (4). This analysis gives quantitative information about the weight of each variable in the predictive model. As can be observed in Figure 7, the notch opening angle presents the highest feature importance, while the dimple characteristics exhibit similar feature importance.

FI = \frac{1}{N}\sum_{i=1}^{N} \left|\mathrm{SHAP}_i\right| \qquad (4)
Figure 7. Feature importance barplot.
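Equation (4) can be sketched directly: given a SHAP matrix with one row per sample and one column per feature, the importance of feature j is the column-wise mean of the absolute values. The SHAP matrix below is an illustrative placeholder, not the study's results.

```python
def feature_importance(shap_matrix):
    """FI_j = (1/N) * sum_i |SHAP_ij|, per equation (4)."""
    n = len(shap_matrix)
    m = len(shap_matrix[0])
    return [sum(abs(row[j]) for row in shap_matrix) / n
            for j in range(m)]

# toy SHAP matrix: rows = samples, columns = [angle, r_dimple, position]
shap_vals = [[-2.0, -1.0, -1.0],
             [ 2.0,  1.0,  1.0],
             [ 0.0,  0.0,  0.0]]

fi = feature_importance(shap_vals)  # approx. [1.33, 0.67, 0.67]
```

Ranking `fi` in descending order reproduces the kind of barplot shown in Figure 7, with the largest value identifying the dominant feature.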
4. Conclusions

A Local Approach has been employed to investigate the behaviour of DBC substrates across different geometric configurations. The analysis of the SED in the vicinity of the bi-material notch allowed the derivation of a single energy-based parameter for comparison across all possible configurations. The same SED level can be achieved in multiple ways, which should be evaluated according to the geometric constraints dictated by the specific application for which the