Behrooz Keshtegar et al. / Procedia Structural Integrity 48 (2023) 348–355
where λ is the penalty factor, a positive value defined as λ = 1/|g(μ_X)| (g(μ_X) is the performance function evaluated at the mean point μ_X), and P(X) is the penalty function, given as

P(X) = { g(X),  g(X) > 0
       { 0,     g(X) ≤ 0        (7)
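The penalized formulation above can be sketched in code. This is a minimal illustration, not the authors' implementation: the performance function `g`, the mean point `mu`, and the distance-plus-penalty shape assumed for the objective of Eq. (5) are all hypothetical.

```python
import numpy as np

def g(x):
    # Hypothetical performance (limit-state) function, for illustration only.
    return x[0] ** 2 + 2.0 * x[1] - 5.0

mu = np.array([1.0, 1.0])        # mean point of the random variables (assumed)
lam = 1.0 / abs(g(mu))           # penalty factor: lambda = 1 / |g(mu)|

def penalty(x):
    # P(X) = g(X) when g(X) > 0, else 0 (Eq. 7).
    gx = g(x)
    return gx if gx > 0 else 0.0

def penalized_objective(x):
    # Assumed shape of the probabilistic model in Eq. (5): distance to the
    # origin in standard normal space plus the weighted penalty term.
    return np.linalg.norm(x) + lam * penalty(x)
```

The penalty term is zero over the feasible region (g ≤ 0) and grows with the constraint violation, so a non-gradient optimizer is steered toward the limit state without requiring sensitivity information.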
This non-gradient method, requiring no sensitivity vector, can be applied to the refined FORM formulation to determine the MPP. The PSO and HS evolutionary meta-heuristics are used to optimize the probabilistic model in Eq. (5).

3.1. Refined FORM-based PSO

The PSO is a powerful meta-heuristic algorithm that is widely used in engineering and machine learning to solve optimization problems (Ben Seghier et al. 2020; Seghier et al. 2021). PSO is formulated iteratively using two guiding solutions, i.e. the best position found by each particle (p_best) and the best position found by the whole swarm (g_best), and the position of each particle X is updated according to the particle velocity V. The velocity and position of the particles are adjusted from search iteration k to k+1 by p_best and g_best as follows (Shi 2001):
V^{k+1} = ω V^k + c_1 r_1 [p_best^k − X^k] + c_2 r_2 [g_best^k − X^k]
X^{k+1} = X^k + V^{k+1}        (8)
where c_1 and c_2 are the acceleration coefficients, r_1 and r_2 are two random numbers sampled from a uniform distribution in the range [0, 1], and ω is the inertia weight that balances the global and local search, as below:
ω(k) = ω_max − (ω_max − ω_min) k / NI,   k = 1, 2, ..., NI        (9)
In PSO, the maximum (v_i^max) and minimum (v_i^min) velocities are computed from the standard deviation (σ_i) of the random variable x_i as v_i^max = σ_i/10 and v_i^min = −σ_i/10, which also bound the initial velocity of each particle. The parameters of PSO in this study are set as c_1 = c_2 = 2, ω_min = 0.4 and ω_max = 0.9.
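One PSO update step under the settings above can be sketched as follows. The problem dimensions, standard deviations, and random seed are illustrative assumptions; only the update rule of Eq. (8) and the ±σ_i/10 velocity limits come from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative standard deviations of the random variables.
sigma = np.array([1.0, 2.0])
v_max, v_min = sigma / 10.0, -sigma / 10.0   # velocity limits: +/- sigma_i / 10
c1 = c2 = 2.0                                 # acceleration coefficients

def pso_step(x, v, p_best, g_best, w):
    """One velocity/position update per Eq. (8), with velocity clamping."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v_new = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
    v_new = np.clip(v_new, v_min, v_max)      # enforce the velocity limits
    return x + v_new, v_new
```

Clamping the velocity to ±σ_i/10 keeps the step size of each coordinate proportional to the spread of the corresponding random variable, which stabilizes the MPP search.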
3.2. Refined FORM-based HS optimization

The HS is based on the musical process of searching for a perfect harmony and is well suited to solving complex engineering problems such as optimizing the parameters of machine learning models for prediction tasks (Keshtegar and Seghier 2018). The HS algorithm was proposed by Geem et al. (2001) based on a random process that finds the optimum through the following adjustment process:
x_i^{j,new} = { x_i^j,                          r_1 < HMCR
             { x_i^L + r (x_i^U − x_i^L),       otherwise        (10)

x_i^{j,new} = { x_i^{j,new} + (2 r_3 − 1) bw,   r_2 < PAR
             { x_i^{j,new},                     otherwise        (11)
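The improvisation step of Eqs. (10)–(11) (memory consideration, random selection, and pitch adjustment) can be sketched as below. The control-parameter values, variable bounds, and random seed are illustrative assumptions, not values from this study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative HS control parameters and variable bounds.
HMCR, PAR, bw = 0.9, 0.3, 0.05
x_L, x_U = np.array([0.0, 0.0]), np.array([1.0, 1.0])

def improvise(harmony_memory):
    """Generate one new harmony from the memory, per Eqs. (10)-(11)."""
    HMS, n = harmony_memory.shape
    x_new = np.empty(n)
    for i in range(n):
        if rng.random() < HMCR:
            # Memory consideration: reuse x_i from a randomly chosen harmony.
            x_new[i] = harmony_memory[rng.integers(HMS), i]
            if rng.random() < PAR:
                # Pitch adjustment within bandwidth bw, Eq. (11).
                x_new[i] += (2.0 * rng.random() - 1.0) * bw
        else:
            # Random selection within the bounds [x_L, x_U], Eq. (10).
            x_new[i] = x_L[i] + rng.random() * (x_U[i] - x_L[i])
    return np.clip(x_new, x_L, x_U)
```

With probability HMCR a component is drawn from the harmony memory (and perturbed with probability PAR), otherwise it is sampled uniformly from its bounds, mirroring the two branches of Eq. (10).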