
Fig. 2 Architecture of a three-layer neural network by Chen et al. (2022)

Multilayer perceptrons (MLPs) were built using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) learning algorithm to solve the prediction problem. The sum of squared errors (SOS) was used to evaluate the quality of the model's prediction. The mean absolute percentage error (MAPE) was chosen as the forecasting error and was calculated as

MAPE = \frac{1}{n} \sum_{i=1}^{n} \frac{\left| a_{\exp}(i) - a_{\mathrm{pred}}(i) \right|}{\left| a_{\exp}(i) \right|} \cdot 100\%   (1)

The mean absolute percentage error is a metric used to evaluate the accuracy of predictive models. It measures the average relative error between the actual (experimental) and predicted values, expressed as a percentage. This metric is popular because of its simplicity and ease of interpretation: it shows how far off the model's predictions are, on average, in percentage terms.

3. Results and discussion

The crack length was predicted at the stress ratios R = 0.03, 0.1, 0.3. The sample contained 154 elements by Liu et al. (2023), of which 70% were randomly selected for the training set, and 30% were left to evaluate the quality of the prediction. The input data included the number of loading cycles N and the stress ratio R, whereas the output variable was the crack length a. It was found that the prediction results agree with the experimental ones.

Fig. 3 shows a high correlation between the predicted and experimental values. The points are located close to the bisector of the first coordinate angle, indicating the consistency of the data. Fig. 4 shows the predicted and experimental dependences of the crack length on the number of loading cycles (a-N curves) for R = 0.03, 0.1, 0.3 for the test sample. An error of 0.2% was obtained by the neural network method on the test sample. As can be seen from these figures, the absence of significant deviations indicates that the model learns well from the data and does not suffer from noticeable over- or underfitting. The model also demonstrates high-quality crack length prediction, as the difference between the experimental and predicted values is minimal. Generally, when R increases (e.g., from 0.03 to 0.3), the crack grows faster, which corresponds to known physical laws.

In the second stage, the sample contained 55 elements of the experimental dependence of crack length on the number of loading cycles at R = 0.03; 80% of them were selected for the training set, and the remaining 20% formed the testing set used to assess the quality of the prediction. The input data included the number of loading cycles N at R = 0.03, whereas the output variable was the crack length a. The stopping parameter for network training was the number of epochs, which in this investigation was equal to 1000. Fig. 3 shows the relationship between the experimental values of the crack length and the predicted values at R = 0.03 for the neural networks.
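To make the described workflow concrete, the following is a minimal sketch, not the authors' original implementation: an MLP trained with a BFGS-family quasi-Newton solver, a 70/30 train/test split, inputs N and R, output a, and the MAPE of Eq. (1). The dataset below is a synthetic placeholder, since the experimental data of Liu et al. (2023) are not reproduced here, and scikit-learn's 'lbfgs' solver is assumed as a stand-in for the BFGS algorithm.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def mape(a_exp, a_pred):
    # Mean absolute percentage error, Eq. (1)
    a_exp, a_pred = np.asarray(a_exp), np.asarray(a_pred)
    return np.mean(np.abs(a_exp - a_pred) / np.abs(a_exp)) * 100.0

# Placeholder data: 154 samples with loading cycles N and stress ratio R as inputs
rng = np.random.default_rng(0)
N = rng.uniform(1e4, 5e5, 154)           # number of loading cycles (hypothetical values)
R = rng.choice([0.03, 0.1, 0.3], 154)    # stress ratio
a = 1.0 + 2e-5 * N * (1.0 + R)           # crack length, mm (illustrative relation only)

X = np.column_stack([N, R])
X_train, X_test, y_train, y_test = train_test_split(X, a, test_size=0.30, random_state=1)

# Three-layer perceptron; 'lbfgs' is a quasi-Newton (BFGS-family) solver
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), solver='lbfgs', max_iter=1000, random_state=1),
)
model.fit(X_train, y_train)

print(f"Test MAPE: {mape(y_test, model.predict(X_test)):.2f}%")

The hidden-layer size and the synthetic a-N relation above are illustrative choices only; the reported 0.2% test error refers to the authors' model trained on the experimental data.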
