
Chahboub Yassine et al. / Procedia Structural Integrity 42 (2022) 1025–1032


$$\dot{f} = \dot{f}_g + \dot{f}_n \qquad (5)$$

Where the components are further formulated as follows:

$$\dot{f}_g = (1 - f)\,\mathrm{tr}\,\dot{\boldsymbol{\varepsilon}}^{pl} \qquad (6)$$

$$\dot{f}_n = A\,\dot{\varepsilon}^{pl}_M \qquad (7)$$

$$A = \frac{f_N}{S_N \sqrt{2\pi}} \exp\left[ -\frac{1}{2} \left( \frac{\varepsilon^{pl}_M - \varepsilon_N}{S_N} \right)^2 \right] \qquad (8)$$

In which $f_N$ is the volume fraction of the second-phase particles responsible for void nucleation, $\varepsilon_N$ is the mean strain at the time of void nucleation, $S_N$ is the standard deviation of the nucleation strain distribution, and $\mathrm{tr}\,\dot{\boldsymbol{\varepsilon}}^{pl}$ is the volumetric plastic strain rate.
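As an illustrative sketch (not part of the paper), the rate equations (5)–(8) can be evaluated directly. The function and variable names below are hypothetical, and the numerical inputs are arbitrary examples:

```python
import math

def nucleation_intensity(eps_pl, f_N, eps_N, S_N):
    """Nucleation intensity A of Eq. (8): a normal distribution of the
    nucleation strain with mean eps_N and standard deviation S_N,
    scaled by the second-phase particle fraction f_N."""
    return (f_N / (S_N * math.sqrt(2.0 * math.pi))) * math.exp(
        -0.5 * ((eps_pl - eps_N) / S_N) ** 2
    )

def void_rate(f, tr_eps_pl_rate, eps_pl, eps_pl_rate, f_N, eps_N, S_N):
    """Total void volume fraction rate, Eq. (5) = growth (6) + nucleation (7)."""
    f_growth = (1.0 - f) * tr_eps_pl_rate                      # Eq. (6)
    A = nucleation_intensity(eps_pl, f_N, eps_N, S_N)          # Eq. (8)
    f_nucl = A * eps_pl_rate                                   # Eq. (7)
    return f_growth + f_nucl

# Example using the Schmitt et al. values of Table 1
# (eps_N = 0.3, S_N = 0.1, f_n = 0.002); strain rates are made up.
rate = void_rate(f=0.01, tr_eps_pl_rate=1e-3, eps_pl=0.3,
                 eps_pl_rate=1e-3, f_N=0.002, eps_N=0.3, S_N=0.1)
```

Note that the nucleation term peaks when the equivalent plastic strain reaches the mean nucleation strain $\varepsilon_N$, which is where the Gaussian in Eq. (8) attains its maximum.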

So eight parameters have to be defined: $(q_1, q_2, f_0, f_c, f_n, f_f, \varepsilon_N, S_N)$.

The values of $q_1$ and $q_2$ are almost constant across several studies on the determination of GTN parameters for different materials, as shown in Table 1 below: $q_1 = 1.5$, $q_2 = 1$.

Table 1: Initial values of GTN parameters

| References | q₁ | q₂ | ε_N | S_N | f₀ | f_c | f_n | f_f | Material |
|---|---|---|---|---|---|---|---|---|---|
| Bauvineau et al. (1996) (48) | 1.5 | 1 | - | - | 0.002 | 0.004 | - | - | CMn steel |
| Decamp et al. (1997) (49) | 1.5 | 1 | - | - | 0.0023 | 0.004 | - | 0.225 | CMn steel |
| Schmitt et al. (1997) (50) | 1.5 | 1 | 0.3 | 0.1 | 0 | 0.06 | 0.002 | 0.212 | Ferritic base steel |
| Skallerud and Zhang (1997) (51) | 1.25 | 1 | 0.3 | 0.1 | 0.0003 | 0.026 | 0.006 | 0.15 | CMn steel |
| Benseddiq and Imad (2008) (52) | 1.5 | 1 | 0.3 | 0.1 | 0 | 0.004–0.06 | 0.002–0.02 | ~0.2 | - |

1.2. Backpropagation approach

An Artificial Neural Network (ANN) is a mathematical model that simulates the behaviour of biological neural networks. Backpropagation of error is one of the ANN's advantages, since it allows the network to be trained until the error falls below a specified level of accuracy. The schema of a neural network is determined by the network topology and two additional parameters: the transfer function and the learning process. The training procedure can be adapted to supervised or unsupervised learning. The architecture of the ANN model consists of an input layer, a hidden layer, and an output layer; each hidden neuron receives the outputs of all neurons in the preceding layer, as indicated in the hidden layer diagram below. We trained our neural network with the nnstart tool of Matlab 2018, using a database of sixty notched tensile test simulations, as shown in Fig. 1. The backpropagation approach typically consists of finding the parameters that minimize the difference between the finite-element-predicted response and the experimental response. A backpropagation-trained multilayer perceptron is one of the most common and adaptable types of neural network, and it can handle non-linear
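The training loop of a backpropagation-trained multilayer perceptron can be sketched in a few lines. This is a minimal illustration only, not the MATLAB nnstart setup used in the study: the synthetic data, network size, and learning rate below are arbitrary assumptions standing in for the sixty-simulation database.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the simulation database: 60 input vectors
# (e.g. sampled points of a simulated response curve) mapped to a
# scalar target (e.g. one GTN parameter). Values are made up.
X = rng.uniform(-1.0, 1.0, size=(60, 4))
y = (X @ np.array([0.3, -0.2, 0.5, 0.1]))[:, None] ** 2   # non-linear target

# One-hidden-layer perceptron: 4 inputs -> 8 tanh units -> 1 output
W1 = rng.normal(0.0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

for epoch in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    # Backward pass: propagate the error gradient of the MSE loss
    # from the output layer back to the hidden layer
    g_out = 2.0 * err / len(X)
    gW2 = h.T @ g_out; gb2 = g_out.sum(0)
    g_h = (g_out @ W2.T) * (1.0 - h ** 2)   # tanh derivative
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    # Gradient-descent parameter update
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

The same loop structure underlies the supervised training performed by tools such as nnstart, which additionally handle data splitting, regularization, and more sophisticated optimizers.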
