T. Pham-Bao et alii, Frattura ed Integrità Strutturale, 70 (2024) 55-70; DOI: 10.3221/IGF-ESIS.70.03
The solution of Eqn. (10) can be determined through the impulse response function $h_r(t)$ and the damped frequency $\omega_{dr} = \omega_r\sqrt{1-\zeta_r^2}$, as follows:
$$q_r(t) = \int_0^t f_r(\tau)\, h_r(t-\tau)\, d\tau \qquad (11)$$

$$h_r(t) = \frac{1}{\omega_{dr}}\, e^{-\zeta_r \omega_r t} \sin(\omega_{dr} t)$$
After substituting $q_r(t)$ from Eqn. (11) into Eqn. (8), we obtain the vibration response of the beam under the moving load:
$$w(x,t) = \sum_{r=1}^{\infty} \frac{\phi_r(x)}{\omega_{dr}} \int_0^t f_r(\tau)\, e^{-\zeta_r \omega_r (t-\tau)} \sin\bigl(\omega_{dr}(t-\tau)\bigr)\, d\tau \qquad (12)$$
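The modal sum in Eqn. (12) can be evaluated numerically mode by mode. The sketch below uses a trapezoidal Duhamel (convolution) integral on a uniform time grid; the function and variable names are illustrative, not taken from the study:

```python
import numpy as np

def modal_response(t, phi_x, omega, zeta, f_modal, tau):
    """Evaluate Eqn. (12) at one position and one time t.
    phi_x[r]   : mode shape value phi_r(x) at the observation point
    omega[r]   : undamped natural frequency of mode r
    zeta[r]    : damping ratio of mode r
    f_modal[r] : samples of the modal force f_r on the uniform grid tau
    """
    w = 0.0
    dtau = tau[1] - tau[0]
    for r in range(len(omega)):
        w_dr = omega[r] * np.sqrt(1.0 - zeta[r] ** 2)  # damped frequency
        kernel = np.exp(-zeta[r] * omega[r] * (t - tau)) * np.sin(w_dr * (t - tau))
        integrand = np.asarray(f_modal[r]) * kernel
        # trapezoidal rule for the Duhamel (convolution) integral
        integral = dtau * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))
        w += phi_x[r] / w_dr * integral
    return w
```

As a sanity check, for a single undamped mode under a constant unit modal force this reproduces the closed-form step response $(1-\cos\omega t)/\omega^2$.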
The mean value ($m_w$) and variance ($\sigma_w^2$) of the response signal at position $x_i$ on the beam are obtained by taking the expectation and variance of the response in Eqn. (12):

$$m_w(x_i,t) = E[w(x_i,t)] = \sum_{r=1}^{\infty} \frac{m_p\,\phi_r(x_i)}{\omega_{dr}} \int_0^t \phi_r(v\tau)\, e^{-\zeta_r \omega_r (t-\tau)} \sin\bigl(\omega_{dr}(t-\tau)\bigr)\, d\tau \qquad (13)$$

$$\sigma_w^2(x_i,t) = \sum_{r=1}^{\infty} \frac{\sigma_P^2\,\phi_r^2(x_i)}{\omega_{dr}^2} \int_0^t \phi_r^2(v\tau)\, e^{-2\zeta_r \omega_r (t-\tau)} \sin^2\bigl(\omega_{dr}(t-\tau)\bigr)\, d\tau$$
where $m_p$ and $\sigma_P$ are the mean value and standard deviation of the input force. The mean and variance of the vibration response depend on the natural frequencies (corresponding to the stiffness) and on the mode shape functions. Damage at position $x_i$ therefore alters the values of $m_w$ and $\sigma_w$. The study suggests using a correlation coefficient, built from the mean values and variances of the response signals, for damage identification:
$$R\bigl(w(x_i,t),\, w(x_j,t)\bigr) = \frac{E\bigl[(w_i - m_{w_i})(w_j - m_{w_j})\bigr]}{\sigma_{w_i}\,\sigma_{w_j}} \qquad (14)$$
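In practice, Eqn. (14) is estimated from sampled response signals. A minimal sketch, assuming two measured time series at positions $x_i$ and $x_j$ (the function name is illustrative):

```python
import numpy as np

def response_correlation(w_i, w_j):
    """Sample estimate of the correlation coefficient R in Eqn. (14)
    between response signals measured at positions x_i and x_j."""
    w_i = np.asarray(w_i, dtype=float)
    w_j = np.asarray(w_j, dtype=float)
    # Centre each signal on its sample mean (m_w in the text)
    di = w_i - w_i.mean()
    dj = w_j - w_j.mean()
    # Normalise the covariance by the two standard deviations (sigma_w)
    return (di * dj).mean() / (w_i.std() * w_j.std())
```

Identical signals give $R = 1$; a sign-flipped copy gives $R = -1$, so the coefficient is a normalised measure of how similarly two points on the beam respond.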
Artificial neural network (ANN)
Artificial neural networks (ANNs) have gained prominence as powerful computational models for identifying non-linear features. The feedforward neural network (FNN) is one of the most widely used architectures owing to its simplicity and its effectiveness in learning complex mappings between inputs and outputs. An FNN consists of an input layer, one or more hidden layers, and an output layer. The connections between neurons in adjacent layers carry weights, which determine the strength of each connection. Through a series of weighted summations and activation functions, an FNN computes its output y from an input vector x. Activation functions introduce nonlinearity into the network, allowing it to capture complex patterns; common choices include the sigmoid, hyperbolic tangent, rectified linear unit, purelin, and softmax functions. The weighted summation and activation function determine the output $y_j$ of a neuron according to the formula:
$$y_j = f_{ac}\left( \sum_i w_{ji}\, x_i + b_j \right) \qquad (15)$$
where, for each neuron, $x_i$ denotes its inputs, $y_j$ its output, and $b_j$ its bias; $w_{ji}$ is the weight coefficient, the subscript "ji" indicating the connection from input $i$ to neuron $j$; and $f_{ac}$ is the activation function.
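Eqn. (15) can be sketched as a single fully connected layer. The layer sizes, zero initial weights, and sigmoid activation below are illustrative choices, not values from the study:

```python
import numpy as np

def sigmoid(z):
    # Common activation choice; tanh, ReLU, etc. would equally fit Eqn. (15)
    return 1.0 / (1.0 + np.exp(-z))

def dense_layer(x, W, b, f_ac=sigmoid):
    """Compute y_j = f_ac(sum_i W[j, i] * x[i] + b[j]) for every neuron j."""
    return f_ac(W @ x + b)

# Illustrative 3-input, 2-neuron layer with zero weights and biases
x = np.array([0.5, -1.0, 2.0])
W = np.zeros((2, 3))
b = np.zeros(2)
y = dense_layer(x, W, b)  # sigmoid(0) = 0.5 for each neuron
```

Stacking several such layers, each feeding its outputs forward as the next layer's inputs, yields the FNN described above.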