Balaji Srinivasan et al. / Procedia Structural Integrity 60 (2024) 418–432
S.No  Stress factors (SF)                                                            Notation
11    Memb. + Bending inside surface for the branch due to axial load                B-Ins-Axial
12    Memb. + Bending inside surface for the branch due to in-plane moment           B-Ins-Inplane
13    Memb. + Bending inside surface for the branch due to out-of-plane moment       B-Ins-OutPlane
14    Memb. + Bending inside surface for the branch due to torsional moment          B-Ins-Torsion
15    Memb. + Bending inside surface stress factor for the branch due to pressure    B-Ins-Pressure
16    Memb. + Bending inside surface for the header due to axial load                H-Ins-Axial
17    Memb. + Bending inside surface for the header due to in-plane moment           H-Ins-Inplane
18    Memb. + Bending inside surface for the header due to out-of-plane moment       H-Ins-OutPlane
19    Memb. + Bending inside surface for the header due to torsional moment          H-Ins-Torsion
20    Memb. + Bending inside surface stress factor for the header due to pressure    H-Ins-Pressure
21    Memb. + Bending outside surface for the branch due to axial load               B-Out-Axial
22    Memb. + Bending outside surface for the branch due to in-plane moment          B-Out-Inplane
23    Memb. + Bending outside surface for the branch due to out-of-plane moment      B-Out-OutPlane
24    Memb. + Bending outside surface for the branch due to torsional moment         B-Out-Torsion
25    Memb. + Bending outside surface stress factor for the branch due to pressure   B-Out-Pressure
26    Memb. + Bending outside surface for the header due to axial load               H-Out-Axial
27    Memb. + Bending outside surface for the header due to in-plane moment          H-Out-Inplane
28    Memb. + Bending outside surface for the header due to out-of-plane moment      H-Out-OutPlane
29    Memb. + Bending outside surface for the header due to torsional moment         H-Out-Torsion
30    Memb. + Bending outside surface stress factor for the header due to pressure   H-Out-Pressure
3.3. Machine Learning Techniques for Stress Factor Predictions

Random Forest (RF) (Mohanty et al. 2022) and Extreme Gradient Boost (XGB) are popular and powerful machine-learning algorithms for various tasks, including classification and regression problems (Gang et al. 2020). RF is an ensemble method that constructs multiple decision trees during training and aggregates their predictions through averaging (regression) or voting (classification) to arrive at the final output. RF excels at handling large datasets, has a low risk of overfitting, and provides feature-importance measures useful for feature selection (Zahi et al. 2022). XGB, on the other hand, is an optimized gradient-boosting implementation that sequentially builds weak learners (shallow decision trees), each correcting the errors of its predecessors. It resists overfitting, offers high predictive accuracy and fast training times, and supports parallel computing, making it well suited to large datasets (Wengang Z. et al. 2022). Using the STP data, stress factors are evaluated with XGB and with a Random Forest Regression (RFR) model, both trained in Google Colaboratory, and a comparative study is undertaken.

3.3.1. Regression Model Evaluation: R² and MAE Metrics

Regression model performance is evaluated using two essential metrics: R-squared (R²) and Median Absolute Error (MAE). These metrics provide insight into how well the model fits the data and predicts future outcomes. R-squared is a statistical measure indicating the extent to which the model's independent variables explain the variance in the dependent variable. Its value ranges from 0 to 1, with 0 implying that no variance is explained and 1 denoting a perfect fit. However, R-squared alone does not reflect prediction accuracy. In contrast, MAE measures the median absolute difference between predicted and actual values, making it robust against outliers.
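As a rough illustration of the workflow described above (not the authors' actual pipeline), the two regressors can be trained and compared on synthetic data shaped like the STP inputs (d_m/D_m, t/T, d_m/t). Since the paper uses XGBoost, scikit-learn's GradientBoostingRegressor is used here as a stand-in; the data, target function, and all names are hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import r2_score, median_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in for the STP dataset: three geometric-ratio
# inputs (d_m/D_m, t/T, d_m/t) and one stress-factor target.
X = rng.uniform(0.1, 1.0, size=(300, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 - X[:, 2] + rng.normal(0, 0.02, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
    "GBR (XGB stand-in)": GradientBoostingRegressor(random_state=0),
}

scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    # R-squared: variance explained; median absolute error: robust accuracy.
    scores[name] = (r2_score(y_te, pred), median_absolute_error(y_te, pred))

for name, (r2, mae) in scores.items():
    print(f"{name}: R2 = {r2:.3f}, median AE = {mae:.4f}")
```

In practice the paper's models are trained on the actual STP stress-factor table rather than a synthetic target, but the fit/predict/score pattern is the same.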
While R-squared assesses model fit, MAE gauges the precision of predictions; both play vital roles in evaluating regression model performance.

4. Results and Discussions

4.1. Results

4.1.1. Performance indices for the XGB and RFR models

The R-squared score of XGB was found to be significantly better than that of RFR. Table 5 compares the stress factors predicted by the XGB and RFR models for a typical input data set (d_m/D_m, t/T, d_m/t) from STP. The R-squared and MAE scores for the branch and the header are segregated and presented
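For concreteness, the two metrics being compared here can be computed directly from predicted and actual values. This is a minimal sketch using made-up numbers, not data from the study:

```python
import statistics

def r_squared(y_true, y_pred):
    # Fraction of variance in y_true explained by the predictions:
    # 1 - (residual sum of squares) / (total sum of squares).
    mean = sum(y_true) / len(y_true)
    ss_tot = sum((y - mean) ** 2 for y in y_true)
    ss_res = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))
    return 1.0 - ss_res / ss_tot

def median_abs_error(y_true, y_pred):
    # Median of absolute deviations; robust against outliers.
    return statistics.median(abs(yt - yp) for yt, yp in zip(y_true, y_pred))

y_true = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.2, 3.9]
print(r_squared(y_true, y_pred))        # close to 1 -> good fit
print(median_abs_error(y_true, y_pred)) # small -> precise predictions
```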