In contrast, once the threshold is surpassed, the function asymptotically approaches 1, with negligible changes in the resulting f(x) values for increasing x. Consequently, this function accentuates differences among low and medium values while compressing the output for values exceeding the predefined threshold. The fixed values corresponding to f(x) = 0.90 are derived from MIT (2020) and are 1000 m for Total Length, 25 m for Max Span Length, and 10 spans for Number of Spans.

The other quantitative features, namely Average Daily Traffic (ADT), Average Daily Trucks Traffic (ADTT), and Allowable Mass, are grouped into the classes derived from MIT (2020), and the resulting values are then encoded to fall between 0 and 1.

Categorical features, i.e. Static Scheme, Deck Material, Age Class, Code Class, Obstacle Type, and Alternatives, require a different strategy. Since all categorical features are of the "Non-Orderable" type, One-Hot Encoding (OHE, Müller and Guido, 2016) provides an effective solution. OHE is a method that encodes categorical inputs into an array of binary vectors: each value is represented by a unique vector in which all positions are set to 0, except for the position corresponding to the category, which is assigned a value of 1. Table 3 summarizes all the discussed pre-processing strategies and the resulting encoded values.
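The exact analytical form of the saturating transformation is not reported in this excerpt; the sketch below assumes an exponential form f(x) = 1 - exp(-ln(10) x / x90), which reaches 0.90 at the feature-specific threshold x90 (1000 m, 25 m, and 10 spans) and asymptotically approaches 1 beyond it. The function name, dictionary keys, and example values are illustrative only.

```python
import numpy as np

# Thresholds x90 at which the encoded value reaches 0.90 (MIT, 2020).
# The exponential form below is an assumption for illustration; the text
# only states that f asymptotically approaches 1 past the threshold.
THRESHOLDS_90 = {
    "total_length_m": 1000.0,
    "max_span_length_m": 25.0,
    "number_of_spans": 10.0,
}

def saturating_encode(x, x90):
    """Map a non-negative feature to (0, 1), reaching 0.90 at x = x90."""
    k = np.log(10.0) / x90  # chosen so that f(x90) = 1 - 1/10 = 0.90
    return 1.0 - np.exp(-k * np.asarray(x, dtype=float))

# Example: a 1200 m long bridge with a 30 m max span and 12 spans
print(saturating_encode(1200, THRESHOLDS_90["total_length_m"]))   # ~0.94
print(saturating_encode(30, THRESHOLDS_90["max_span_length_m"]))  # ~0.94
print(saturating_encode(12, THRESHOLDS_90["number_of_spans"]))    # ~0.94
```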
Table 2. Statistical Description of selected features

IF | Type | Density | Orderability | Values
Total length (L) | Quantitative | Continuous | Orderable | min = 1, max = 1943
Max span length (l) | Quantitative | Continuous | Orderable | min = 1, max = 167
Number of spans (N) | Quantitative | Discrete | Orderable | min = 1, max = 56
ADT | Quantitative | Continuous | Orderable | min = 0, max = 70199
ADTT | Quantitative | Continuous | Orderable | min = 0, max = 4158
Static scheme | Categorical | Discrete | Non-Orderable | Supported Beams, Arc, Slab, Other
Deck material | Categorical | Discrete | Non-Orderable | Prestressed Reinforced Concrete, Reinforced Concrete, Masonry, Composite, Other
Age class | Categorical | Discrete | Non-Orderable | ≤1945, 1945-1980, ≥1980 *
Code class | Categorical | Discrete | Non-Orderable | Class A, Class B, Class C *
Obstacle type | Categorical | Discrete | Non-Orderable | River, Primary Road Network, Secondary Road Network, Natural Discontinuity, Railway, Built-Up Area, Ditch
Alternatives | Categorical | Discrete | Non-Orderable | Yes, No
Maximum mass | Quantitative | Discrete | Orderable | 60 t, ≤44 t, ≤26 t, ≤8 t, ≤3,5 t *
*Ranges and classes as intended by MIT (2020)
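As a complement to Table 2, the snippet below sketches how the Non-Orderable categorical features could be one-hot encoded with scikit-learn. The column names and example records are assumptions for illustration and are not taken from the actual bridge dataset.

```python
import pandas as pd
from sklearn.preprocessing import OneHotEncoder

# Illustrative records; category labels follow Table 2 (column names assumed).
df = pd.DataFrame({
    "static_scheme": ["Supported Beams", "Arc", "Slab"],
    "deck_material": ["Reinforced Concrete", "Masonry", "Composite"],
    "alternatives":  ["Yes", "No", "Yes"],
})

# Each category becomes a binary column: 1 for the observed value, 0 elsewhere.
# sparse_output requires scikit-learn >= 1.2.
ohe = OneHotEncoder(sparse_output=False, handle_unknown="ignore")
encoded = ohe.fit_transform(df)
print(pd.DataFrame(encoded, columns=ohe.get_feature_names_out(df.columns)))
```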
4. Model creation and tuning

This section is dedicated to creating and optimizing the ML model, with an Artificial Neural Network (ANN) chosen as the predictive algorithm. A brief description of the architecture parameters and hyperparameters to tune is provided here. The architecture of an ANN consists of multiple layers of neurons, where each neuron is characterized by an activation function and associated weights. These neurons are interconnected, with each connection also having its own weight. The training of an ANN is conducted using a supervised learning technique called "backpropagation", which aims to tune the network's weights so as to minimize a chosen error function (Negnevitsky, 2015). The network's architecture and hyperparameters can significantly influence performance (Keller, 2016); thus, in this study, a technique called k-fold Cross-Validation (CV) is adopted to rationally find the best combination of architecture and hyperparameter values.

The number of hidden layers and the number of neurons per layer are the first two hyperparameters considered. While the number of neurons in the hidden layers can vary, the number of neurons in the input layer is fixed and equal to the number of IF. Similarly, the number of neurons in the output layer is always set to match the number of possible Classes of Attention, namely Low, Medium-Low, Medium, Medium-High, and High. Then, an optimization algorithm needs to be chosen, since ANN performance depends on the adopted algorithm (Berahas and Takáč, 2020). The subsequent hyperparameters are a function of the optimization algorithm; in this work three optimization
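The excerpt breaks off while introducing the optimization algorithms compared in the original work. Independently of that choice, the sketch below illustrates the k-fold CV tuning workflow described above, pairing scikit-learn's MLPClassifier with a cross-validated grid search over hidden-layer architecture and solver. The grid values, k = 5, and the synthetic feature matrix are assumptions, not the settings used in the paper.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.neural_network import MLPClassifier

# Placeholder data: the number of columns stands in for the number of IF,
# and the target holds the five Classes of Attention (Low ... High).
rng = np.random.default_rng(0)
X = rng.random((200, 12))            # 12 input features (assumed)
y = rng.integers(0, 5, size=200)     # 5 Classes of Attention

# Candidate architectures and optimization algorithms (illustrative grid).
# Input layer size follows the number of features; output layer size follows
# the number of classes, both set automatically by MLPClassifier.
param_grid = {
    "hidden_layer_sizes": [(16,), (32,), (32, 16)],
    "solver": ["adam", "sgd", "lbfgs"],
}

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
search = GridSearchCV(
    MLPClassifier(max_iter=2000, random_state=0),
    param_grid,
    cv=cv,
    scoring="accuracy",
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```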