Khashayar Shahrezaei et al. / Procedia Structural Integrity 57 (2024) 711–717
[Fig. 1 flowchart. Metamodeling stage: Start — design the preliminary Experiment Design (ED); Observations — generate the ED using a space-filling algorithm (LHS) and divide the data into training and testing sets; Construct surrogates — estimate the model parameters of the ANN, PCE, and Kriging metamodels; Model testing — estimate the model error (RMSE, R²); if the required global accuracy is not reached, add new design point(s) and repeat. GSA stage: Finish — generate parameter samples for sensitivity analysis using the uncertainty profile of the inputs; Visualization — heat map.]
Fig. 1: A schematic illustration of the developed metamodel-based global sensitivity analysis
2. Methodology Outline - Metamodel-based GSA
The Deep Learning Toolbox of MATLAB (The MathWorks, 2022) has been used to develop a multilayer perceptron ANN consisting of a varying number of hidden layers, each with a varying number of neurons, and one output layer. The performance of the ANN depends heavily on the hyperparameters chosen by the user; examples include the learning rate, regularization strength, number of hidden layers, number of neurons per hidden layer, and type of transfer function, among others. Hyperparameter optimization was therefore employed to find the most efficient ANN architecture, i.e. the one that minimizes the error of the ANN model. Marelli et al. (2022) have developed software for tasks within the field of uncertainty quantification, which also offers the construction of various types of metamodels. In this study, the PCE coefficients are computed using the Subspace Pursuit (SP) algorithm developed by Dai and Milenkovic (2009). The Kriging model developed in this study is an adaptation of the MATLAB Kriging code by Forrester et al. (2008). These models were selected because of their proven efficiency in GSA applications and their suitability for handling noise in optimization frameworks. For further discussion on the performance comparison of these metamodels, the authors refer to the work by Gratiet et al. (2015).

Figure 1 illustrates the developed framework for performing GSA using various types of metamodels. The framework consists of two steps: metamodeling and GSA. In the metamodeling step, all input parameters subject to sensitivity analysis are defined, and a preliminary ED is created using a space-filling algorithm such as Latin Hypercube Sampling (LHS). The preliminary ED is then divided into two data sets, namely a training set and a testing set.
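The ED-generation step above can be sketched as follows. This is a minimal Python illustration, not the authors' MATLAB implementation: a basic LHS design on the unit hypercube (one stratum per sample in each dimension) followed by a random split into training and testing sets. The sample counts and the 80/20 split fraction are illustrative assumptions.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Space-filling LHS design on the unit hypercube [0, 1]^d."""
    # Place one sample per stratum in every dimension, jittered inside it.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    # Independently shuffle the strata in every dimension to decorrelate them.
    for j in range(n_dims):
        rng.shuffle(u[:, j])
    return u

def split_design(x, train_frac, rng):
    """Randomly divide the ED into training and testing sets."""
    idx = rng.permutation(len(x))
    n_train = int(round(train_frac * len(x)))
    return x[idx[:n_train]], x[idx[n_train:]]

rng = np.random.default_rng(0)
ed = latin_hypercube(50, 3, rng)        # preliminary ED of 50 samples, 3 inputs
train, test = split_design(ed, 0.8, rng)
print(train.shape, test.shape)          # (40, 3) (10, 3)
```

In practice the unit-hypercube samples would be mapped onto the physical ranges of the input parameters before evaluating the full model.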
All the metamodels are constructed using the training data set, and the testing data set is used to assess the quality of the metamodels. The quality of the metamodels is assessed through two global accuracy measures, namely the coefficient of determination, R², and the Root Mean Square Error (RMSE). The size of the preliminary ED is increased successively, starting with 50 data samples; once the required level of global accuracy is reached, the metamodels are passed forward for conducting the GSA. In the GSA step, the uncertainty profile of the input parameters is used to generate samples for the sensitivity analysis. The Sobol' sensitivity indices (Sobol', 1990) are subsequently computed from the generated samples using the scheme proposed by Saltelli (2004).
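The Saltelli-type estimation of first-order Sobol' indices can be sketched as below. This is a hedged Python illustration of the pick-freeze idea, not the paper's implementation: two independent sample matrices A and B are drawn from the input uncertainty profile (here assumed to be independent U(0, 1) inputs), and for each input i a matrix AB_i is formed from A with its i-th column taken from B. The additive test model and its coefficients are hypothetical, chosen so the analytic indices are known.

```python
import numpy as np

def sobol_first_order(f, n, d, rng):
    """First-order Sobol' indices via a Saltelli-style pick-freeze scheme."""
    a = rng.random((n, d))              # sample matrix A
    b = rng.random((n, d))              # independent sample matrix B
    fa, fb = f(a), f(b)
    var_y = np.var(np.concatenate([fa, fb]))
    s = np.empty(d)
    for i in range(d):
        ab_i = a.copy()
        ab_i[:, i] = b[:, i]            # freeze all inputs except input i
        # Estimator of V_i = Var(E[Y | X_i]), normalized by Var(Y).
        s[i] = np.mean(fb * (f(ab_i) - fa)) / var_y
    return s

# Additive test model: for independent U(0, 1) inputs, each input contributes
# variance c_i^2 / 12, so analytically S_i = c_i^2 / sum(c^2).
coeff = np.array([4.0, 2.0, 1.0])
model = lambda x: x @ coeff

rng = np.random.default_rng(1)
s_hat = sobol_first_order(model, 100_000, 3, rng)
print(np.round(s_hat, 2))   # estimates of the analytic values [0.762, 0.190, 0.048]
```

For correlated or non-uniform inputs, the sampling of A and B would instead follow the joint uncertainty profile of the inputs, as described in the framework above.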