Abdul Musavvir et al. / Procedia Structural Integrity 70 (2025) 432–439
Linear Regression produced the weakest results, with an R2 score of 0.5165, a mean squared error (MSE) of 65.32, a mean absolute error (MAE) of 5.65, and a root mean squared error (RMSE) of 8.08. Decision Tree Regression was second weakest, with an R2 score of 0.6228, an MSE of 50.15, an MAE of 4.10, and an RMSE of 7.08; decision trees are prone to overfitting because they can create highly specific splits for the training data, leading to poor generalization. XGBoost Regression (Sara Elhishi, 2023) achieved the highest R2 score of 0.83, with the lowest MSE of 23.28, an MAE of 2.66, and an RMSE of 4.82, owing to its optimized gradient boosting framework, which builds trees sequentially while parallelizing split finding within each tree, minimizes bias and overfitting, and leverages advanced techniques such as regularization. It was followed by Random Forest (Mary Devika Bandaru, 2024) (Priscila Silva, 2020), which outperforms Decision Tree Regression and Linear Regression thanks to its ensemble approach, reducing overfitting and variance; this helps it maintain a good R2 score of 0.76, with an MSE of 31.49, an MAE of 3.29, and an RMSE of 5.6121.
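The comparison above rests on four error metrics plus R2. As a minimal sketch of how such values are obtained from model predictions, the following pure-Python helper computes them; the exact formula for RME is an assumption (mean absolute error normalized by the mean magnitude of the true values), since the paper does not state its definition.

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute the evaluation metrics reported in Table 2.

    Returns R2, MSE, MAE, RMSE, and RME. The RME formula here
    (MAE divided by the mean |y_true|) is an assumed definition,
    not one given in the paper.
    """
    n = len(y_true)
    errors = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errors) / n          # mean squared error
    mae = sum(abs(e) for e in errors) / n         # mean absolute error
    rmse = math.sqrt(mse)                         # root mean squared error
    mean_y = sum(y_true) / n
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    r2 = 1.0 - (mse * n) / ss_tot                 # coefficient of determination
    rme = mae / (sum(abs(t) for t in y_true) / n) # assumed relative mean error
    return {"R2": r2, "MSE": mse, "MAE": mae, "RMSE": rmse, "RME": rme}

# Toy illustration with made-up values, not the paper's dataset:
metrics = regression_metrics([3.0, 5.0, 7.0, 9.0], [2.5, 5.5, 7.0, 8.0])
print(metrics)
```

A model with higher R2 and lower MSE/MAE/RMSE/RME is preferred, which is the ordering the paragraph above applies to the four regressors.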
Table 2. Algorithm results evaluation.

Algorithm                  R2 Score   MSE       MAE      RMSE     RME
XGBoost Regression         0.8571     23.2832   2.6622   4.8253   0.0419
Random Forest              0.7669     31.4953   3.2977   5.6121   0.0519
Decision Tree Regression   0.6228     50.1515   4.1011   7.0818   0.0646
Linear Regression          0.5165     65.3231   5.6513   8.0823   0.0890

Mean Squared Error (MSE); Mean Absolute Error (MAE); Root Mean Squared Error (RMSE); Relative Mean Error (RME).
Fig. 3. Scatter plot of Linear Regression & Decision Tree Regression.
Fig. 4. Scatter plot of Random Forest Regression & XGBoost Regression.