
Sadjad Naderi et al. / Procedia Structural Integrity 80 (2026) 77–92


Fig. 4. Schematic of the Bayesian hierarchical update process for a single observation step.

3.3.2. DBN architecture

The hierarchical foundation in Fig. 4 naturally extends to sequential parameter evolution through a DBN structure, which converts the static model into a time-dependent system that updates continuously with diagnosis-driven data. The DBN follows a first-order Markov property:

$p(\theta_t \mid \theta_{t-1}) = \mathcal{N}(\theta_t;\ \alpha_t\,\theta_{t-1},\ \Sigma_t)$  (4)

where $\alpha_t \in [0,1]$ controls the trade-off between retaining past information (preventing catastrophic forgetting) and adapting to new data. The DBN captures (i) parameter drift (gradual changes in material properties or systematic biases) and (ii) knowledge refinement (improved parameter estimates as more data become available). At each step, the system state $s_t = \{\mu_t, \Sigma_t, \phi_t\}$ contains the parameter means ($\mu_t$), covariances ($\Sigma_t$), and hyperparameters ($\phi_t$), allowing uncertainties to propagate through time. The adaptive weighting supports robust online learning and enables transfer from offline datasets to streaming data.

3.3.3. Adaptive transfer learning mechanism

The dynamic update step employs an adaptive transfer-learning mechanism that controls how newly acquired data modify existing beliefs. Adaptation is regulated by a scalar weight $\alpha_t$, defined as the minimum of three reliability factors: $\alpha_{\text{stability}}$, $\alpha_{\text{data}}$, and $\alpha_{\text{diagnostics}}$. Updates proceed only when all criteria indicate sufficient trust in the new information. The stability factor measures distributional shift between $\theta_{t-1}$ and $\theta_t$, for example using a divergence metric, and is set to a high value (e.g. $\alpha_{\text{stability}} = 0.9$) when the shift remains below a tolerance, and to a reduced value otherwise. The data factor reflects the information content of the online batch and is computed as

$\alpha_{\text{data}} = \min\!\left(0.9,\ \dfrac{|D_{\text{online}}|}{|D_{\text{online}}| + |D_{\text{offline}}|}\right)$  (5)

which increases the adaptation weight as online evidence accumulates while capping the maximum influence to preserve prior knowledge.
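The reliability-factor logic above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the function names, the batch sizes, and the choice of 0.9 defaults for the stability and diagnostics factors are assumptions for the example; only the min-of-three-factors rule and Eq. (5) come from the text.

```python
def data_factor(n_online, n_offline, cap=0.9):
    """Data reliability factor, Eq. (5): grows with the online batch size
    but is capped so prior (offline) knowledge is never fully overridden."""
    return min(cap, n_online / (n_online + n_offline))

def adaptation_weight(alpha_stability, alpha_data, alpha_diagnostics):
    """Overall adaptation weight: the minimum of the three reliability
    factors, so an update proceeds only when all criteria indicate trust."""
    return min(alpha_stability, alpha_data, alpha_diagnostics)

# Hypothetical batch sizes: 40 online samples against 200 offline samples.
alpha = adaptation_weight(0.9, data_factor(40, 200), 0.9)
```

Taking the minimum (rather than, say, the product) makes each factor a hard gate: a single unreliable criterion suppresses the whole update regardless of how favourable the others are.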
The diagnostics factor relies on sampler and estimator quality metrics, such as the presence of divergent transitions or a low effective sample size, to suppress adaptation when posterior estimates are unreliable. Weighting is applied per parameter so that updates remain isolated and poor estimates for one parameter do not contaminate others. The effective update is implemented as a tempered-likelihood Bayesian update,

$p(\theta_{t+1} \mid D_{t+1}) \propto \left[\,p(D_{t+1} \mid \theta)\,\right]^{\alpha_t}\, p(\theta_t)$  (6)

which reduces the impact of noisy or sparse observations while preserving coherence with Bayesian inference. The scheme enforces conservative and monotonic adaptation, since weights only reduce adaptation when evidence is
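For a conjugate Gaussian model, the tempered update of Eq. (6) has a closed form that makes the role of the weight explicit: raising the likelihood to the power $\alpha$ simply scales its precision. The sketch below assumes a Gaussian prior and Gaussian observations with known noise variance; it is an illustrative special case, not the paper's full hierarchical sampler.

```python
import numpy as np

def tempered_gaussian_update(mu0, var0, y, noise_var, alpha):
    """Tempered-likelihood update, Eq. (6), for a conjugate Gaussian model.
    alpha = 0 leaves the prior untouched; alpha = 1 recovers the standard
    Bayesian posterior; intermediate values damp the influence of the data."""
    y = np.asarray(y, dtype=float)
    prec = 1.0 / var0 + alpha * y.size / noise_var   # tempered posterior precision
    mean = (mu0 / var0 + alpha * y.sum() / noise_var) / prec
    return mean, 1.0 / prec

# Three noisy observations of 2.0 against a N(0, 1) prior (illustrative values).
m_full, v_full = tempered_gaussian_update(0.0, 1.0, [2.0, 2.0, 2.0], 1.0, alpha=1.0)
m_temp, v_temp = tempered_gaussian_update(0.0, 1.0, [2.0, 2.0, 2.0], 1.0, alpha=0.3)
```

With the smaller weight the posterior mean stays closer to the prior and the posterior variance stays larger, which is exactly the conservative behaviour the adaptation scheme is designed to enforce when evidence is weak.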
