Adaptive surrogate-based optimization with dynamic boundary updating for structural problems

Majid ILCHI GHAZAAN , Mostafa SHARIFI

Front. Struct. Civ. Eng. ›› 2025, Vol. 19 ›› Issue (8): 1355–1372

RESEARCH ARTICLE


Abstract

This paper introduces dynamic boundary updating-surrogate model-based (DBU-SMB), a novel evolutionary framework for global optimization that integrates dynamic boundary updating (DBU) within a surrogate model-based (SMB) approach. The method operates in three progressive stages: adaptive sampling, DBU, and refinement. In the first stage, adaptive sampling strategically explores the design space to gather critical information for improving the surrogate model. The second stage incorporates DBU to guide the optimization toward promising regions in the parameter space, enhancing consistency and efficiency. Finally, the refinement stage iteratively improves the optimization results, ensuring a comprehensive exploration of the design space. The proposed DBU-SMB framework is algorithm-agnostic, meaning it does not rely on any specific machine learning model or meta-heuristic algorithm. To demonstrate its effectiveness, we applied DBU-SMB to four highly nonlinear and non-convex optimization problems. The results show a reduction of over 90% in the number of function evaluations compared to traditional methods, while avoiding entrapment in local optima and discovering superior solutions. These findings highlight the efficiency and robustness of DBU-SMB in achieving optimal designs, particularly for large-scale and complex optimization problems.

Keywords

machine learning / surrogate model / adaptive sampling / XGBoost / structural optimization

Cite this article

Majid ILCHI GHAZAAN, Mostafa SHARIFI. Adaptive surrogate-based optimization with dynamic boundary updating for structural problems. Front. Struct. Civ. Eng., 2025, 19(8): 1355–1372. DOI: 10.1007/s11709-025-1211-6


1 Introduction

Finite element analysis (FEA) is a powerful tool for structural analysis, but its high computational cost becomes a significant challenge when applied to large-scale optimization problems. In global optimization, where the goal is to minimize an objective function by iteratively evaluating design variables, the frequent use of FEA can lead to prohibitive computational expenses. To address this issue, surrogate models, which are simplified representations of complex systems, have emerged as an efficient alternative. These models approximate the behavior of the original system with reasonable accuracy while significantly reducing computational costs.

Surrogate models, also known as metamodels, approximate the behavior of complex systems using simplified representations. These models are trained on a limited set of data and can be iteratively refined to improve accuracy. Various machine learning techniques, such as support vector machines, Kriging, neural networks, and XGBoost, have been successfully employed as surrogate models in structural optimization and reliability analysis [1–4]. For instance, Mai et al. [5] integrated deep neural networks with differential evolution for truss optimization, while Lieu et al. [6] used adaptive surrogate models for structural reliability analysis. Similarly, Nourian et al. [7] combined graph neural networks with particle swarm optimization for truss design optimization, and Lee et al. [8] employed deep neural networks for structural damage detection. Javanmardi and Ahmadi-Nedushan [9] proposed an optimized artificial neural network as a surrogate model for solving structural optimization problems, demonstrating its effectiveness in reducing computational costs for a double-layer barrel vault design. Li et al. [10] proposed a two-step method for reliability analysis using surrogate models and Monte Carlo simulation, while Luo et al. [11] developed a hybrid simulation approach for structural reliability analysis. Cheng and Lu [12] introduced an adaptive method for reliability analysis using ensemble learning, and Yang et al. [13] proposed a surrogate model-based method for reliability-oriented buckling topology optimization.

Recent studies have demonstrated the effectiveness of Kriging-based surrogate modeling in handling complex systems with high-dimensional input spaces and uncertainty quantification. A Kriging-based framework was introduced by Vu-Bac et al. [14] for stochastic prediction of mechanical properties in polymeric nanocomposites, highlighting its applicability in surrogate modeling under uncertainty. The same research group further explored uncertainty quantification using atomistic models [15], reinforcing the role of surrogate models in capturing nonlinear behavior with limited data. However, Kriging often suffers from scalability issues and strong assumptions related to Gaussian processes, making it less suitable for highly nonlinear and non-convex problems.

To address these limitations, we employ XGBoost, a gradient-boosted decision tree algorithm that excels at modeling complex, high-dimensional, and nonlinear relationships. Unlike Kriging, XGBoost does not rely on probabilistic assumptions, offers better scalability, and efficiently handles large data sets and noisy data, making it well-suited for surrogate modeling in computationally intensive structural optimization problems [16–20]. Feng et al. [21] used XGBoost to predict the shear strength of RC shear walls, while Zhang et al. [22] applied it to predict the maximum stress of lattice structures in additive manufacturing. Wu et al. [23] utilized the XGBoost Random Forest (XGBRF) algorithm for real-time prediction of tunnel face conditions, demonstrating superior performance compared to other machine learning techniques. Alshboul et al. [24] employed XGBoost for shear strength prediction in steel fiber-reinforced concrete beams. Truong et al. [25] developed a machine learning framework using XGBoost for bi-objective optimization of nonlinear steel structures.

Despite these advancements, challenges remain in ensuring the accuracy and efficiency of surrogate models, particularly in high-dimensional and non-convex optimization problems. Adaptive sampling strategies have been proposed to address these challenges by dynamically updating the training data based on model predictions, thereby improving accuracy in critical regions of the design space. For example, Roy and Chakraborty [26] used support vector regression with adaptive sampling for reliability analysis, while Echard et al. [27] combined Kriging with Monte Carlo simulation for efficient sampling. Liu et al. [28] proposed an adaptive stochastic configuration network ensemble for structural reliability analysis, and Proverbio et al. [29] developed an adaptive sampling methodology using radial basis functions. However, existing methods often struggle with balancing exploration and exploitation, especially in problems with complex constraints.

In this study, we propose a novel framework, dynamic boundary updating-surrogate model-based (DBU-SMB), to address these challenges. The framework integrates adaptive sampling with a dynamic boundary updating (DBU) mechanism to guide the optimization process toward promising regions of the design space. By iteratively refining the search boundaries and leveraging meta-heuristic algorithms, DBU-SMB significantly reduces the number of function evaluations required for convergence while avoiding local optima. The framework is demonstrated on several highly nonlinear and non-convex truss optimization problems, showcasing its efficiency and robustness.

The meta-heuristic algorithms used in this study are the artificial hummingbird algorithm (AHA) [30], the equilibrium optimizer (EO) [31], and the golden eagle optimizer (GEO) [32], chosen for their ability to handle complex optimization problems with multiple constraints. Truss optimization problems, particularly those involving frequency constraints, are challenging because of their high nonlinearity and non-convexity [33–36]. In this study, we focus on optimizing truss structures under frequency constraints to identify their global optimum, precisely because these problems are highly nonlinear, non-convex, and implicit in nature [37–41].

The structure of the rest of this paper is as follows. Section 2 presents the surrogate model chosen for use in this study. The evolutionary framework is discussed in Section 3. The numerical results are presented and discussed in Section 4. Finally, the conclusion is provided in Section 5.

2 Surrogate model: XGBoost

XGBoost, an advanced machine learning algorithm based on gradient boosting decision trees, is widely recognized for its efficiency and accuracy in handling complex regression and classification tasks [42]. Its ability to model nonlinear relationships and handle large data sets makes it a suitable choice for surrogate modeling in structural optimization problems.

The XGBoost algorithm operates by aggregating predictions from an ensemble of decision trees. For a given data set with n samples, the predicted value ŷi for the ith sample is computed as:

$$\hat{y}_i = \sum_{k=1}^{K} f_k(x_i), \qquad f_k \in \mathcal{F},$$

where K is the number of trees, xi represents the input features, and fk denotes the prediction from the kth tree. The set F represents the space of regression trees, defined as:

$$\mathcal{F} = \left\{ f(x) = w_{q(x)} \right\}, \qquad q: \mathbb{R}^m \to T, \quad w \in \mathbb{R}^T,$$

where q(x) maps the input x to a leaf index, and w represents the weight of the leaf. Each tree fk is characterized by its structure q and leaf weights w.

XGBoost optimizes an objective function that combines a loss function and a regularization term. For regression tasks, the objective function is defined as:

$$\mathrm{Obj} = \sum_{i=1}^{n} L(y_i, \hat{y}_i) + \sum_{k=1}^{K} \Omega(f_k),$$

where L(yi,ŷi) measures the difference between the true value yi and the predicted value ŷi, and Ω(fk) is the regularization term that penalizes model complexity. The regularization term is given by:

$$\Omega(f_k) = \gamma T + \frac{1}{2}\lambda \sum_{j=1}^{T} w_j^2,$$

where T is the number of leaves in the tree, wj is the weight of the jth leaf, and γ and λ are regularization parameters controlling tree complexity and leaf weights, respectively.

XGBoost incorporates several advanced features, including parallel processing, tree pruning, and sparsity awareness, which make it highly scalable and efficient for large data sets. Its ability to handle missing data and optimize memory usage further enhances its suitability for surrogate modeling in computationally expensive optimization tasks.

In this study, XGBoost is employed as the surrogate model due to its proven performance in approximating complex, nonlinear functions. Its efficiency in handling large data sets and providing accurate predictions makes it an ideal choice for reducing the computational cost of global optimization problems.
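To make the regularized objective above concrete, the following toy Python sketch boosts one-split regression trees ("stumps") by hand. It is an illustration, not the xgboost library; the helper names (`fit_stump`, `boost`) are ours. For the loss $L = \frac{1}{2}(y_i - \hat{y}_i)^2$, minimizing $\sum \frac{1}{2}(r - w)^2 + \frac{1}{2}\lambda w^2$ over a leaf holding $m$ residuals $r$ gives the closed-form leaf weight $w^\ast = \sum r / (m + \lambda)$, which the code uses directly:

```python
import numpy as np

def fit_stump(x, residual, lam=1.0, gamma=0.0):
    """Pick the single split maximizing the regularized gain.
    Leaf weight uses the closed form w* = sum(r)/(m + lam); a leaf's score
    is G^2/(H + lam) with g = -r, h = 1 for the 0.5*(y - yhat)^2 loss."""
    best = None
    for s in np.unique(x)[:-1]:
        left, right = residual[x <= s], residual[x > s]
        gain = (left.sum() ** 2 / (len(left) + lam)
                + right.sum() ** 2 / (len(right) + lam)
                - residual.sum() ** 2 / (len(residual) + lam)) / 2 - gamma
        if best is None or gain > best[0]:
            best = (gain, s,
                    left.sum() / (len(left) + lam),    # left leaf weight
                    right.sum() / (len(right) + lam))  # right leaf weight
    _, s, wl, wr = best
    return lambda q: np.where(q <= s, wl, wr)

def boost(x, y, K=50, lr=0.3, lam=1.0):
    """Additive ensemble y_hat = sum_k lr * f_k(x), fit to residuals."""
    trees, pred = [], np.zeros_like(y, dtype=float)
    for _ in range(K):
        f = fit_stump(x, y - pred, lam=lam)
        trees.append(f)
        pred = pred + lr * f(x)
    return lambda q: sum(lr * f(q) for f in trees)

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x)                  # cheap stand-in for an expensive FEA response
model = boost(x, y)
print(float(np.mean((model(x) - y) ** 2)))  # small in-sample error
```

In practice the paper's framework would call the xgboost package itself; this sketch only shows how the ensemble prediction and the $\gamma$/$\lambda$ regularization of Eqs. above interact.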

3 Proposed method: dynamic boundary updating-surrogate model-based

The DBU-SMB framework integrates adaptive sampling with the DBU method to efficiently locate the global optimum. It leverages surrogate models and operates through three stages: adaptive sampling, DBU-based boundary optimization, and refinement, ensuring reliability and accuracy.

3.1 Design of experiments

The initial data set is critical for training the surrogate model. A design of experiments (DOE) approach is used to generate this data set, ensuring efficient exploration of the design space.

1) Latin hypercube sampling (LHS): LHS is employed to uniformly distribute sample points across the design space. LHS is efficient for space-filling and works well with limited initial samples.

2) Number of samples (ns): The initial data set size is set to 10 times the number of design variables (nv), balancing computational cost and model accuracy for most problems.

3) Computational cost considerations: While a smaller DOE reduces computational cost, it may result in a low-fidelity surrogate model. To address this, the data set is iteratively refined to improve model accuracy during the optimization process.

3.2 Performance metrics

The DBU-SMB framework employs three key performance metrics to evaluate the surrogate model’s accuracy and guide the optimization process.

1) Frequency-constrained error (FCE): Measures the relative error between the predicted frequency (fpredicted,i) and the constrained frequency (fconstrained,i) for each constraint i:

$$\mathrm{FCE}_i = \frac{\left| f_{\mathrm{predicted},i} - f_{\mathrm{constrained},i} \right|}{f_{\mathrm{constrained},i}}.$$

2) Conditional frequency-constrained error (CFCE): The CFCE is a binary metric that checks if a frequency constraint is satisfied. If the constraint is met, the error is zero; otherwise, it calculates the FCE. For n conditions, the CFCE is defined as:

$$\mathrm{CFCE} = \frac{1}{n}\sum_{i=1}^{n} \begin{cases} 0, & \text{if condition } i \text{ is satisfied}, \\ \mathrm{FCE}_i, & \text{otherwise}. \end{cases}$$

3) Mean conditional frequency-constrained error (MCFCE): Averages the CFCE values across all predictions (s) to assess overall model performance:

$$\mathrm{MCFCE} = \frac{1}{s}\sum_{i=1}^{s} \mathrm{CFCE}_i.$$

These metrics ensure that the surrogate model’s predictions align with the structural constraints, enabling the optimization process to converge efficiently toward feasible and optimal solutions.
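The three metrics are cheap to compute directly. The sketch below (hypothetical helper names, assuming lower-bound frequency constraints $f_{\mathrm{predicted},i} \geq f_{\mathrm{constrained},i}$, as in the test problems of Section 4) illustrates FCE, CFCE, and MCFCE:

```python
import numpy as np

def fce(f_pred, f_con):
    """Relative frequency-constrained error, per constraint."""
    f_pred, f_con = np.asarray(f_pred, float), np.asarray(f_con, float)
    return np.abs(f_pred - f_con) / f_con

def cfce(f_pred, f_con):
    """Zero where the lower-bound constraint f_pred >= f_con holds,
    FCE where it is violated; averaged over the n constraints."""
    err = fce(f_pred, f_con)
    ok = np.asarray(f_pred, float) >= np.asarray(f_con, float)
    return float(np.mean(np.where(ok, 0.0, err)))

def mcfce(preds, f_con):
    """Average CFCE over all s predictions."""
    return float(np.mean([cfce(p, f_con) for p in preds]))

f_con = [7.0, 15.0, 20.0]               # 10-bar truss constraints (Hz)
print(cfce([7.2, 14.4, 21.0], f_con))   # only f2 violated: (0.6/15)/3 ≈ 0.0133
```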

3.3 Dynamic boundary updating

The DBU method is a core component of the DBU-SMB framework, designed to adaptively refine the search space during optimization. By dynamically adjusting the boundaries of design variables based on the surrogate model’s predictions, DBU ensures that the optimization process converges efficiently toward the global optimum while maintaining feasibility. The DBU method consists of three key components: the optimal weight ensemble matrix (OWEM), the dynamic boundary matrix (DBM), and the boundary updating process.

3.3.1 Optimal weight ensemble matrix

The OWEM is constructed to retain the most accurate and lightweight predictions during the optimization process. It combines two primary objectives: minimizing the CFCE and reducing the weight of the structure. The OWEM is populated with predictions that exhibit both the lowest CFCE values and the lightest weights, ensuring that only the most reliable solutions are considered. The number of rows in the OWEM is determined as follows:

$$n_r = \min(n_s, 100).$$

In this step, the predictions with the lowest CFCE values and, among those, the lightest weights are used to fill the matrix, ensuring that only the most reliable predictions with the smallest errors are included.

3.3.2 Dynamic boundary matrix

The DBM is derived from the OWEM and is used to dynamically adjust the search space boundaries. The DBM is partitioned into two halves.

1) First half: Contains the lightest predictions with the lowest CFCE values, representing the most reliable solutions at the current optimization stage.

2) Second half: Contains the remaining lightest predictions, excluding those already included in the first half.

The DBM is expressed as:

$$\mathrm{DBM} = \begin{bmatrix} \mathrm{DBM}_{\text{first half}} \\ \mathrm{DBM}_{\text{second half}} \end{bmatrix}.$$

The number of rows in each half is calculated as:

$$n_r = n_v / 2.$$

By splitting the OWEM into two halves, the DBM ensures that the optimization process focuses on the most promising regions of the design space while gradually incorporating newer predictions.
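The selection rule for the OWEM ("lowest CFCE and lightest weight") leaves some freedom in implementation. One plausible reading, sketched below with a hypothetical `build_dbm` helper, ranks candidates by CFCE with weight as the tie-breaker, caps the OWEM at 100 rows, and stacks two halves of $n_v/2$ rows each:

```python
import numpy as np

def build_dbm(designs, weights, cfce_vals, nv):
    """Hypothetical OWEM/DBM assembly (our reading of Sections 3.3.1-3.3.2):
    rank candidates by CFCE (structural weight breaks ties), keep at most 100
    rows (OWEM), then vertically stack two halves of nv/2 rows each (DBM)."""
    order = np.lexsort((weights, cfce_vals))   # primary key: CFCE
    owem = np.asarray(designs, float)[order][:min(len(designs), 100)]
    half = max(1, nv // 2)
    return np.vstack([owem[:half], owem[half:2 * half]])

designs = [[10.0, 2.0], [12.0, 3.0], [11.0, 2.5], [9.0, 4.0]]  # toy data
dbm = build_dbm(designs, weights=[5.0, 6.0, 4.0, 7.0],
                cfce_vals=[0.0, 0.0, 0.01, 0.02], nv=4)
print(dbm.shape)  # (4, 2)
```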

3.3.3 Boundary updating process

The boundary updating process uses the DBM to adjust the search space boundaries for each design variable. The goal is to narrow the search space around the most promising regions, improving the efficiency of the optimization process. The new boundaries are calculated based on the minimum and maximum values of each column in the DBM, with a safe margin added to prevent overly restrictive boundaries.

To calculate the new boundaries, two candidate parameters, lbj and ubj, are first introduced:

$$lb_j = \min\left(\mathrm{DBM}_{1,j}, \ldots, \mathrm{DBM}_{n,j}\right) - \delta/2, \qquad ub_j = \max\left(\mathrm{DBM}_{1,j}, \ldots, \mathrm{DBM}_{n,j}\right) + \delta/2,$$

where $n = 1, 2, \ldots, n_r$, $j = 1, 2, \ldots, n_v$, and $\delta$ is an additional parameter forming a safe margin around the boundaries; in this study it is defined as 10% of the difference between the maximum and minimum values calculated from the DBM.

Finally, the new boundaries are determined based on the following formulas:

$$lb_j^{\mathrm{new}} = lb_j^{\mathrm{old}} + rate_j^{lb}\left(lb_j - lb_j^{\mathrm{old}}\right), \qquad ub_j^{\mathrm{new}} = ub_j^{\mathrm{old}} + rate_j^{ub}\left(ub_j - ub_j^{\mathrm{old}}\right),$$

where lbjnew and ubjnew are the new lower and upper boundaries, respectively. The rate parameter enforces a gradual boundary update, preventing sudden or drastic changes in the design space, while still allowing a quick correction when a wrong search path is detected; it is determined according to Eq. (13):

$$rate_j^{lb} = \begin{cases} 1, & \text{if } lb_j - lb_j^{\mathrm{old}} \leq 0, \\ 0.1, & \text{otherwise}, \end{cases} \qquad rate_j^{ub} = \begin{cases} 1, & \text{if } ub_j - ub_j^{\mathrm{old}} \geq 0, \\ 0.1, & \text{otherwise}. \end{cases}$$

Fig.1 shows how the global optimum is discovered during the boundary updating process. The figure illustrates why each formula and parameter introduced in this section matters; in particular, the roles of the safe margin and the rate parameter are clearly visible. Without the ability to correct the optimization path, the search would become trapped in a local optimum and the global optimum could not be discovered.
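The whole update can be expressed compactly. The sketch below (a hypothetical `update_bounds` helper; the rate directions follow our reading of Eq. (13), i.e., a shrinking move uses the slow rate 0.1 and an expanding move uses rate 1 for quick path correction) derives new bounds from the DBM column extremes with the 10% safe margin:

```python
import numpy as np

def update_bounds(dbm, lb_old, ub_old, margin=0.10):
    """DBU boundary update: candidate bounds from DBM column extremes with a
    safe margin delta = margin * (column spread); gradual rate 0.1 when a
    bound shrinks the space, full rate 1 when it must expand (correction)."""
    col_min, col_max = dbm.min(axis=0), dbm.max(axis=0)
    delta = margin * (col_max - col_min)
    lb_c, ub_c = col_min - delta / 2, col_max + delta / 2   # candidate bounds
    rate_lb = np.where(lb_c - lb_old <= 0, 1.0, 0.1)
    rate_ub = np.where(ub_c - ub_old >= 0, 1.0, 0.1)
    lb_new = lb_old + rate_lb * (lb_c - lb_old)
    ub_new = ub_old + rate_ub * (ub_c - ub_old)
    return lb_new, ub_new

dbm = np.array([[10.0, 2.0], [12.0, 3.0], [11.0, 2.5]])   # hypothetical DBM
lb, ub = update_bounds(dbm, lb_old=np.array([0.645, 0.645]),
                       ub_old=np.array([50.0, 50.0]))
print(lb, ub)   # bounds creep toward [9.9, 1.95] x [12.1, 3.05] at rate 0.1
```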

3.4 Dynamic boundary updating-surrogate model based

The DBU-SMB method is a three-stage evolutionary framework designed to efficiently solve complex global optimization problems. The method integrates adaptive sampling, DBU, and refinement to systematically explore the design space, focus on promising regions, and refine the final solution. A schematic overview of the DBU-SMB workflow is presented in Fig.2, which illustrates the iterative nature of the algorithm. Below, we describe each stage in detail.

3.4.1 Stage 1: adaptive sampling

The optimization process begins with adaptive sampling, where a surrogate model is trained on an initial data set generated using LHS. This ensures a uniform exploration of the design space. The surrogate model predicts the most promising design points, and the corresponding outputs are computed using FEA. These new data points are added to the data set, and the surrogate model is retrained iteratively. This process gradually improves the model’s accuracy in identifying optimal regions, as shown in Fig.3, where predictions become concentrated around optimal points.

3.4.2 Stage 2: adaptive sampling + dynamic boundary updating

In the second stage, the DBU method is introduced to refine the optimization boundaries based on the surrogate model’s predictions. DBU dynamically adjusts the search space for each design variable, focusing the optimization on regions with the highest likelihood of containing the global optimum. This adaptive adjustment ensures that the search remains efficient and avoids unnecessary exploration of less promising areas. As illustrated in Fig.4, the predictions become centered around the global optimum, guiding the optimization process toward more efficient convergence.

3.4.3 Stage 3: adaptive sampling + dynamic boundary updating + refinement

The refinement stage improves solution precision by applying a local gradient-based search to fine-tune the optimal point found in Stage 2. We use an interior-point method for constrained optimization, though other approaches like the Method of Moving Asymptotes [43,44] can also be applied. Refinement is limited to the most promising predictions to maintain computational efficiency. This final stage delivers a high-quality solution while preserving the global search perspective established earlier. Fig.5 illustrates the operation of the refinement process, showing how the method systematically refines the solution space.

3.5 The dynamic boundary updating-surrogate model-based algorithm

The pseudocode of the proposed methodology is given in Algorithm 1 and, for better understanding, as a flowchart in Fig.6.

Algorithm 1 DBU-SMB

1: $n_s = \min(10 n_v, 100)$

2: Generate design of experiments by LHS (x1,,xns)

3: Calculate output for each design using FEA (y1,,yns)

4: Set data {(xi,yi),i=1,2,,ns}

5: procedure DBU-SMB

6: while (stopping criterion is not reached) do

7: Construct surrogate model from data

8: m ← number of data

9: Solve surrogate optimization problem to find optimum point (xm+1)

10: ym+1 is calculated

11: Update the data set by adding the new data point (xm+1,ym+1)

12: if (MCFCE<2% or FEs>2×ns) then

13: Cond. 1: MCFCE < 1%; Cond. 2: FEs > 3×ns; Cond. 3: FCEi < 1%; Cond. 4: costpredicted < costbest

14: if (all four conditions above hold) then

15: m ← number of data

16: Refine xm with the interior-point algorithm to find xm+1

17: ym+1 is calculated

18: Update the data set with (xm+1, ym+1)

19: end if

20: Update the optimization boundaries using the DBU method as described in Eq. (13)

21: end if

22: end while

23: Return xbest, costbest

24: end procedure

3.6 Stopping criteria

Stopping criteria are crucial for ensuring the optimization process terminates efficiently without unnecessary computational overhead. In the DBU-SMB method, we employ an adaptive stopping criterion based on the standard deviation of weights derived from the DBM. This approach ensures a balance between computational efficiency and solution accuracy.

The standard deviation of weights from the DBM reflects the stability and diversity of design solutions during optimization. A low standard deviation indicates convergence, suggesting that further iterations are unlikely to yield significant improvements. In this study, a threshold below 1 is used to signal convergence, ensuring minimal variation between iterations. This threshold can be adjusted based on the problem’s complexity, with higher thresholds for simpler problems to allow adequate exploration and lower thresholds for complex problems to accelerate convergence.

By using this adaptive stopping criterion, the DBU-SMB method avoids arbitrary termination conditions (e.g., maximum function evaluations) and ensures the optimization process stops only when further improvements are negligible.
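The criterion reduces to a one-line test on the structural weights stored in the DBM; the `should_stop` helper below is a hypothetical sketch using the study's threshold of 1:

```python
import numpy as np

def should_stop(dbm_weights, threshold=1.0):
    """Adaptive stopping test: converged when the standard deviation of the
    structural weights in the DBM drops below the threshold (1.0 here)."""
    return float(np.std(dbm_weights)) < threshold

print(should_stop([530.9, 531.1, 530.7]))   # True: weights have stabilized
print(should_stop([500.0, 600.0, 700.0]))   # False: still exploring
```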

4 Numerical examples

To evaluate the effectiveness of the proposed DBU-SMB framework, we applied it to four truss optimization problems with varying levels of complexity: the 10-bar plane truss, 120-bar dome truss, 200-bar planar truss, and 600-bar dome truss. These problems were selected to test the method’s ability to handle both small-scale and large-scale optimization tasks, particularly under challenging frequency constraints. The performance of DBU-SMB was assessed using three meta-heuristic algorithms: the AHA, EO, and GEO. To demonstrate the contribution of each stage in the framework, each problem was solved in three operational modes: Mode 1 (adaptive sampling only), Mode 2 (adaptive sampling + DBU), and Mode 3 (adaptive sampling + DBU + refinement). This approach allowed us to systematically evaluate how each stage improves the optimization process, with Mode 1 providing initial exploration, Mode 2 refining the search boundaries, and Mode 3 achieving precise final solutions. By comparing the results across these modes, we highlight the incremental benefits of each stage in the DBU-SMB framework.

4.1 10-bar plane truss

The 10-bar plane truss, shown in Fig.7, is a benchmark problem frequently used in structural optimization studies [45,46]. The design variables are the cross-sectional areas of each member, with a modulus of elasticity of 68.95 GPa and a material density of 2767.99 kg/m3. Nonstructural masses of 453.6 kg are attached to free nodes (1–4). The first three natural frequencies are constrained as follows: f1 ≥ 7 Hz, f2 ≥ 15 Hz, and f3 ≥ 20 Hz. The cross-sectional areas are bounded between 0.645 and 50 cm2.

The optimization results, summarized in Tab.1, show that DBU-SMB significantly outperforms the traditional approach. The best weight achieved was 530.8087 kg using the EO algorithm, with a 97% reduction in function evaluations compared to the traditional method. Tab.2 presents the results under different operational modes, highlighting the progressive improvement in solution quality as additional stages are introduced. For example, in Mode 3, the weight of the truss was reduced by approximately 4% compared to Mode 1, demonstrating the effectiveness of the refinement stage. Fig.8 illustrates the convergence of optimization boundaries for the 10 design variables, showing how DBU-SMB dynamically adjusts the search space to focus on promising regions.

4.2 120-bar dome truss

The 120-bar dome truss problem, shown in Fig.9, consists of 7 design variables, making it less complex than the other problems in this study [47,48]. The modulus of elasticity is 210 GPa, and the material density is 7971.81 kg/m3. Nonstructural masses are attached to the free nodes: 3000 kg at Node 1, 500 kg at Nodes 2–13, and 100 kg at the remaining nodes. The first and second natural frequencies are constrained to f1 ≥ 9 Hz and f2 ≥ 11 Hz. The cross-sectional areas of the members range between 1 and 129.3 cm2.

The optimization results, presented in Tab.3, demonstrate that DBU-SMB achieves an optimal weight of 8889.2701 kg with fewer than 600 function evaluations, representing a 97% reduction compared to the traditional approach. Tab.4 shows the impact of adding each stage of the DBU-SMB method. For instance, in Mode 3, the weight of the truss was reduced by approximately 4.5% compared to Mode 1, highlighting the importance of the refinement stage. Fig.10 depicts the convergence of optimization boundaries for the 7 design variables, illustrating how DBU-SMB effectively narrows the search space to focus on the global optimum.

4.3 200-bar planar truss

The 200-bar planar truss, shown in Fig.11, represents one of the most challenging global optimization problems in this study due to its 29 design variables [49,50]. The modulus of elasticity is 210 GPa, and the material density is 7860 kg/m3. Nonstructural masses of 100 kg are attached to Nodes 1–5. The natural frequencies must satisfy the constraints f1 ≥ 5 Hz, f2 ≥ 10 Hz, and f3 ≥ 15 Hz. The cross-sectional areas have a lower bound of 0.1 cm2.

The optimization results, summarized in Tab.5, show that DBU-SMB achieves a 95% reduction in function evaluations while maintaining solution quality. The EO algorithm performed particularly well, yielding an optimal weight of 2172.68 kg. Tab.6 presents the results under different operational modes. In Mode 3, the weight of the truss was reduced by approximately 30% compared to Mode 1, demonstrating the significant impact of the refinement stage. Fig.12 illustrates the convergence of optimization boundaries for the 29 design variables. Notably, the DBU-SMB method demonstrated adaptability by correcting initial boundary errors during the optimization process, ensuring convergence to the global optimum.

4.4 600-bar dome truss

The 600-bar dome truss problem, shown in Fig.13, is a global optimization problem with 25 design variables [51,52]. The modulus of elasticity is 200 GPa, and the material density is 7850 kg/m3. Nonstructural masses of 100 kg are attached to all free nodes. The natural frequency constraints are f1 ≥ 5 Hz and f3 ≥ 7 Hz. The cross-sectional areas of the members are bounded between 1 and 100 cm2.

The optimization results, presented in Tab.7, show that DBU-SMB achieves a 93% reduction in function evaluations, with the EO algorithm yielding the best performance (optimal weight: 6171.0271 kg). Tab.8 highlights the impact of adding each stage of the DBU-SMB method. In Mode 3, the weight of the truss was reduced by approximately 60% compared to Mode 1, underscoring the importance of the refinement stage. Fig.14 illustrates the convergence of optimization boundaries for the 25 design variables, showing how DBU-SMB dynamically adjusts the search space to focus on promising regions.

5 Conclusions

In this study, we introduced the DBU-SMB framework, an evolutionary approach for global optimization that integrates adaptive sampling, DBU, and refinement stages. The proposed method leverages XGBoost as a surrogate model to significantly reduce computational costs while maintaining solution quality. Our results demonstrate that DBU-SMB reduces the number of function evaluations by 93% to 97% compared to traditional methods, making it highly efficient for solving complex optimization problems. Importantly, the framework avoids entrapment in local optima, ensuring robust global solutions even for high-dimensional and nonlinear problems.

The effectiveness of DBU-SMB was validated through the optimization of truss structures under frequency constraints, including the 10-bar, 120-bar, 200-bar, and 600-bar dome truss problems. These problems, characterized by their high nonlinearity and non-convexity, highlight the method’s ability to handle large-scale optimization tasks efficiently. The integration of DBU-SMB with various meta-heuristic algorithms (AHA, EO, and GEO) further underscores its versatility and adaptability, as it consistently improved solution quality across different algorithms.

The key contributions of this work include a substantial reduction in computational cost achieved through adaptive sampling and surrogate modeling, the dynamic adjustment of design boundaries to concentrate the search on promising regions, and a refinement stage that improves solution precision. These features collectively position DBU-SMB as a highly effective tool for engineering design, especially in addressing large-scale structures with complex constraints. This study demonstrates that DBU-SMB is a robust and efficient framework for global optimization, offering significant improvements in both computational efficiency and solution quality.

References

[1]

Ghiasi R, Ghasemi M R. Optimization-based method for structural damage detection with consideration of uncertainties––A comparative study.Smart Structures and Systems, 2018, 22(5): 561–574

[2]

Juliani M A, Gomes W J. An efficient Kriging-based framework for computationally demanding constrained structural optimization problems.Structural and Multidisciplinary Optimization, 2022, 65(1): 4

[3]

Peng L, Liu L, Long T, Yang W. An efficient truss structure optimization framework based on CAD/CAE integration and sequential radial basis function metamodel.Structural and Multidisciplinary Optimization, 2014, 50(2): 329–346

[4]

Lee S, Kim H, Lieu Q X, Lee J. CNN-based image recognition for topology optimization.Knowledge-Based Systems, 2020, 198: 105887

[5]

Mai H T, Kang J, Lee J. A machine learning-based surrogate model for optimization of truss structures with geometrically nonlinear behavior.Finite Elements in Analysis and Design, 2021, 196: 103572

[6]

Lieu Q X, Nguyen K T, Dang K D, Lee S, Kang J, Lee J. An adaptive surrogate model to structural reliability analysis using deep neural network.Expert Systems with Applications, 2022, 189: 116104

[7]

Nourian N, El-Badry M, Jamshidi M. Design optimization of truss structures using a graph neural network-based surrogate model.Algorithms, 2023, 16(8): 380

[8]

Lee S, Park S, Kim T, Lieu Q X, Lee J. Damage quantification in truss structures by limited sensor-based surrogate model.Applied Acoustics, 2021, 172: 107547

[9]

Javanmardi R, Ahmadi-Nedushan B. Optimal design of double-layer barrel vaults using genetic and pattern search algorithms and optimized neural network as surrogate model.Frontiers of Structural and Civil Engineering, 2023, 17(3): 378–395

[10]

Li W, Geng R, Chen S. CSP-free adaptive Kriging surrogate model method for reliability analysis with small failure probability.Reliability Engineering & System Safety, 2024, 243: 109898

[11]

Luo C, Keshtegar B, Zhu S P, Niu X. EMCS-SVR: Hybrid efficient and accurate enhanced simulation approach coupled with adaptive SVR for structural reliability analysis.Computer Methods in Applied Mechanics and Engineering, 2022, 400: 115499

[12]

Cheng K, Lu Z. Structural reliability analysis based on ensemble learning of surrogate models.Structural Safety, 2020, 83: 101905

[13]

Yang B, Wang X, Cheng C Z, Lee I, Hu Z J. Surrogate model-based method for reliability-oriented buckling topology optimization under random field load uncertainty. Structures, 2024, 63: 106382

[14]

Vu-Bac N, Silani M, Lahmer T, Zhuang X, Rabczuk T. A unified framework for stochastic predictions of mechanical properties of polymeric nanocomposites. Computational Materials Science, 2015, 96: 520–535

[15]

Vu-Bac N, Zhuang X, Rabczuk T. Uncertainty quantification for mechanical properties of polyethylene based on fully atomistic model. Materials, 2019, 12(21): 3613

[16]

Chen T, He T. XGBoost: Extreme Gradient Boosting. R package version 0.4-2, 2015, 1(4): 1–4

[17]

Le L T, Nguyen H, Zhou J, Dou J, Moayedi H. Estimating the heating load of buildings for smart city planning using a novel artificial intelligence technique PSO-XGBoost. Applied Sciences, 2019, 9(13): 2714

[18]

Nguyen H, Drebenstedt C, Bui X N, Bui D T. Prediction of blast-induced ground vibration in an open-pit mine by a novel hybrid model based on clustering and artificial neural network. Natural Resources Research, 2020, 29(2): 691–709

[19]

Zhang W, Zhang R, Wu C, Goh A T C, Wang L. Assessment of basal heave stability for braced excavations in anisotropic clay using extreme gradient boosting and random forest regression. Underground Space, 2022, 7(2): 233–241

[20]

Zhang W, Wu C, Zhong H, Li Y, Wang L. Prediction of undrained shear strength using extreme gradient boosting and random forest based on Bayesian optimization. Geoscience Frontiers, 2021, 12(1): 469–477

[21]

Feng D C, Wang W J, Mangalathu S, Taciroglu E. Interpretable XGBoost-SHAP machine-learning model for shear strength prediction of squat RC walls. Journal of Structural Engineering, 2021, 147(11): 04021173

[22]

Zhang Z, Zhang Y, Wen Y, Ren Y. Data-driven XGBoost model for maximum stress prediction of additive manufactured lattice structures. Complex & Intelligent Systems, 2023, 9(5): 1–12

[23]

Wu L, Li X, Yuan J, Wang S. Real-time prediction of tunnel face conditions using XGBoost random forest algorithm. Frontiers of Structural and Civil Engineering, 2023, 17(12): 1777–1795

[24]

Alshboul O, Almasabha G, Shehadeh A, Al-Shboul K. A comparative study of LightGBM, XGBoost, and GEP models in shear strength management of SFRC-SBWS. Structures, 2024, 61: 106009

[25]

Truong V H, Cao T S, Tangaramvong S. A robust machine learning-based framework for handling time-consuming constraints for bi-objective optimization of nonlinear steel structures. Structures, 2024, 62: 106226

[26]

Roy A, Chakraborty S. Support vector regression based metamodel by sequential adaptive sampling for reliability analysis of structures. Reliability Engineering & System Safety, 2020, 200: 106948

[27]

Echard B, Gayton N, Lemaire M. AK-MCS: An active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety, 2011, 33(2): 145–154

[28]

Liu H, Li S, Huang X, Ding P, Jiang Z. Adaptive stochastic configuration network ensemble for structural reliability analysis. Expert Systems with Applications, 2024, 237: 121633

[29]

Proverbio M, Costa A, Smith I F. Adaptive sampling methodology for structural identification using radial-basis functions. Journal of Computing in Civil Engineering, 2018, 32(3): 04018008

[30]

Zhao W, Wang L, Mirjalili S. Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Computer Methods in Applied Mechanics and Engineering, 2022, 388: 114194

[31]

Faramarzi A, Heidarinejad M, Stephens B, Mirjalili S. Equilibrium optimizer: A novel optimization algorithm. Knowledge-Based Systems, 2020, 191: 105190

[32]

Mohammadi-Balani A, Dehghan Nayeri M, Azar A, Taghizadeh-Yazdi M. Golden eagle optimizer: A nature-inspired metaheuristic algorithm. Computers & Industrial Engineering, 2021, 152: 107050

[33]

Renkavieski C, Parpinelli R S. Meta-heuristic algorithms to truss optimization: Literature mapping and application. Expert Systems with Applications, 2021, 182: 115197

[34]

Miguel L F F, Miguel L F F. Shape and size optimization of truss structures considering dynamic constraints through modern metaheuristic algorithms. Expert Systems with Applications, 2012, 39(10): 9458–9467

[35]

Cao H, Qian X, Chen Z, Zhu H. Enhanced particle swarm optimization for size and shape optimization of truss structures. Engineering Optimization, 2017, 49(11): 1939–1956

[36]

Jafari M, Salajegheh E, Salajegheh J. Optimal design of truss structures using a hybrid method based on particle swarm optimizer and cultural algorithm. Structures, 2021, 32: 391–405

[37]

Kaveh A, Zolghadr A. Meta-heuristic methods for optimization of truss structures with vibration frequency constraints. Acta Mechanica, 2018, 229(10): 3971–3992

[38]

Farshchin M, Camp C V, Maniat M. Optimal design of truss structures for size and shape with frequency constraints using a collaborative optimization strategy. Expert Systems with Applications, 2016, 66: 203–218

[39]

Tejani G G, Savsani V J, Patel V K, Mirjalili S. Truss optimization with natural frequency bounds using improved symbiotic organisms search. Knowledge-Based Systems, 2018, 143: 162–178

[40]

Millan-Paramo C, Abdalla Filho J E. Size and shape optimization of truss structures with natural frequency constraints using modified simulated annealing algorithm. Arabian Journal for Science and Engineering, 2020, 45(5): 3511–3525

[41]

Rezaeizadeh A, Zandi M, Ilchi Ghazaan M. Sensitivity of optimal double-layer grid designs to geometrical imperfections and geometric nonlinearity conditions in the analysis phase. Frontiers of Structural and Civil Engineering, 2024, 18(8): 1209–1224

[42]

Chen T, Guestrin C. XGBoost: A scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. San Francisco, CA: Association for Computing Machinery, 2016, 785–794

[43]

Vu-Bac N, Duong T X, Lahmer T, Zhuang X, Sauer R A, Park H S, Rabczuk T. A NURBS-based inverse analysis for reconstruction of nonlinear deformations of thin shell structures. Computer Methods in Applied Mechanics and Engineering, 2018, 331: 427–455

[44]

Vu-Bac N, Rabczuk T, Park H S, Fu X, Zhuang X. A NURBS-based inverse analysis of swelling induced morphing of thin stimuli-responsive polymer gels. Computer Methods in Applied Mechanics and Engineering, 2022, 397: 115049

[45]

Lieu Q X, Do D T, Lee J. An adaptive hybrid evolutionary firefly algorithm for shape and size optimization of truss structures with frequency constraints. Computers & Structures, 2018, 195: 99–112

[46]

Ho-Huu V, Vo-Duy T, Luu-Van T, Le-Anh L, Nguyen-Thoi T. Optimal design of truss structures with frequency constraints using improved differential evolution algorithm based on an adaptive mutation scheme. Automation in Construction, 2016, 68: 81–94

[47]

Kaveh A, Ghazaan M I. Hybridized optimization algorithms for design of trusses with multiple natural frequency constraints. Advances in Engineering Software, 2015, 79: 137–147

[48]

Vaez S R H, Mehanpour H, Fathali M A. Reliability assessment of truss structures with natural frequency constraints using metaheuristic algorithms. Journal of Building Engineering, 2020, 28: 101065

[49]

Khodadadi N, Snasel V, Mirjalili S. Dynamic arithmetic optimization algorithm for truss optimization under natural frequency constraints. IEEE Access, 2022, 10: 16188–16208

[50]

Mortazavi A. Size and layout optimization of truss structures with dynamic constraints using the interactive fuzzy search algorithm. Engineering Optimization, 2021, 53(3): 369–391

[51]

Carvalho J P, Lemonge A C C, Carvalho É C R, Hallak P H, Bernardino H S. Truss optimization with multiple frequency constraints and automatic member grouping. Structural and Multidisciplinary Optimization, 2018, 57(2): 547–577

[52]

Degertekin S, Bayar G Y, Lamberti L. Parameter free Jaya algorithm for truss sizing-layout optimization under natural frequency constraints. Computers & Structures, 2021, 245: 106461

RIGHTS & PERMISSIONS

Higher Education Press