
for the cutting speed and machining time. The rise in flank wear with higher cutting speeds and longer machining times may be attributed to the higher cutting temperatures generated, which increase the flank wear rate.

Figure 12: Flank wear at different machining times varying with cutting speed, feed, and depth of cut.

Artificial neural network (ANN) flank wear growth model

An artificial neural network (ANN) is a computational method that can model the relationships between input parameters and output responses. A fully connected multi-layer neural network is known as a multilayer perceptron (MLP). An MLP is a typical feedforward artificial neural network: in feedforward propagation, information flows only in the forward direction. In the hidden layer, the input is used to create an intermediate function, which is subsequently used to generate an output. Fig. 13 shows a typical MLP architecture. An MLP is characterized by three layers, namely an input layer, a hidden layer, and an output layer, each consisting of an interconnected group of artificial neurons. The numbers of neurons in the input and output layers are equal to the numbers of input variables and corresponding output values, respectively.
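As an illustration of this architecture, the following is a minimal sketch of such an MLP in Python/NumPy, assuming cutting speed, feed, depth of cut, and machining time as the four input variables and flank wear as the single output. The layer sizes, activation function, and learning rate are illustrative assumptions, not the settings used in this work; the sketch also includes one error back-propagation update of the kind described below.

```python
# Minimal sketch (illustrative, not the authors' model): a one-hidden-layer MLP
# mapping cutting speed, feed, depth of cut, and machining time to flank wear,
# trained with the error back-propagation rule outlined in this section.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_hidden, n_outputs = 4, 8, 1             # inputs: Vc, f, ap, t; output: VB
W1 = rng.normal(0.0, 0.1, (n_inputs, n_hidden))     # input-to-hidden weights
b1 = np.zeros(n_hidden)                             # hidden-layer bias (threshold)
W2 = rng.normal(0.0, 0.1, (n_hidden, n_outputs))    # hidden-to-output weights
b2 = np.zeros(n_outputs)
lr = 0.01                                           # learning rate (assumed)

def forward(X):
    """Feedforward pass: hidden layer with tanh activation, linear output layer."""
    H = np.tanh(X @ W1 + b1)
    Y = H @ W2 + b2
    return H, Y

def train_step(X, T):
    """One back-propagation update: compute the output error, then propagate it
    backwards to adjust the weights of both layers (gradient descent on MSE)."""
    global W1, b1, W2, b2
    H, Y = forward(X)
    err_out = Y - T                               # error at the output node
    err_hid = (err_out @ W2.T) * (1.0 - H**2)     # error back-propagated through tanh
    W2 -= lr * H.T @ err_out / len(X)
    b2 -= lr * err_out.mean(axis=0)
    W1 -= lr * X.T @ err_hid / len(X)
    b1 -= lr * err_hid.mean(axis=0)
    return float(np.mean(err_out**2))             # training MSE for this step

# Hypothetical usage with normalised training data
# (X_train: n_samples x 4, T_train: n_samples x 1):
# for epoch in range(5000):
#     mse = train_step(X_train, T_train)
```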

Figure 13: Typical ANN architecture.

The input layer receives the raw input from the domain. No calculation is done at this layer; its neurons only transmit the data to the hidden layer. The neurons that make up the hidden layer are not visible to the outside world and provide the neural network with abstraction. The hidden layer performs various computations on the features entered through the input layer and forwards the result to the output layer. The output layer, the last layer of the network, combines the knowledge gained from the hidden layer and produces the desired outcome. In most cases, all hidden layers use the same activation function, whereas the activation function of the output layer typically differs from that of the hidden layers; its choice depends on the goal of the model or the type of prediction. The network must be trained if output predictions are to be made with greater accuracy. During the training phase of a model, the synaptic weights of the network are modified in an ordered way in order to obtain the desired result. The error backpropagation method is the most commonly used training algorithm. In the first phase of a standard ANN algorithm, the weights and thresholds are initialized. Each neuron's output is then computed from its input data and the initial weights, which yields the network's final output prediction. The error at the output node is then determined and the weights are adjusted accordingly. In addition, errors back-propagated from the output-layer nodes are used to adjust the weights in the preceding layers [8]. This procedure is repeated for each pair of input and output training data, and training ends when the ANN output is sufficiently close to the expected output for each set.

Training of an ANN flank wear growth model

A neural network is a supervised machine-learning algorithm for solving classification or regression problems with different sizes and depths. However, one needs to preprocess the input features, initialize the weights, add a bias if needed, and choose
