Open Access Journal Article

A comparative study of differential evolution variants in constrained structural optimization

TL;DR
This work examines the performance of several DE variants, namely the standard DE, the composite DE (CoDE), the adaptive DE with optional external archive (JADE), and the self-adaptive variants jDE and SaDE, for handling constrained structural optimization problems associated with truss structures.
Abstract
Differential evolution (DE) is a population-based metaheuristic algorithm that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Such algorithms make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. DE is arguably one of the most versatile and stable population-based search algorithms and exhibits robustness on multi-modal problems. In the field of structural engineering, most real-world optimization problems are associated with one or several constraints. Constrained optimization problems are often challenging to solve due to their complexity and high nonlinearity. In this work we examine the performance of several DE variants, namely the traditional DE, the composite DE (CoDE), the adaptive DE with optional external archive (JADE) and the self-adaptive variants jDE and SaDE, for handling constrained structural optimization problems associated with truss structures. The performance of each DE variant is evaluated using five well-known benchmark structures in 2D and 3D. The evaluation is based on the final optimum result and the rate of convergence. Valuable conclusions are obtained from the statistical analysis, which can help a practicing structural engineer choose a suitable algorithm for this kind of problem.
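The abstract does not include implementation details, but the recipe it describes, handling a constrained truss problem with DE, is commonly implemented by folding the constraints into the objective through a penalty term. The following is a minimal sketch of that idea in Python; the weight and constraint functions, the bounds, and the penalty coefficient rho are illustrative placeholders, not quantities taken from the paper.

import numpy as np

def weight(x):
    # Illustrative objective: total structural weight, here simply the sum of member areas.
    return np.sum(x)

def constraints(x):
    # Illustrative inequality constraints written as g_i(x) <= 0,
    # standing in for stress / displacement limits of a real truss model.
    return np.array([1.0 - np.prod(x), np.max(x) - 10.0])

def penalized(x, rho=1e6):
    # Static penalty: feasible designs keep their true weight, infeasible
    # designs are charged rho times the sum of squared constraint violations.
    violation = np.maximum(constraints(x), 0.0)
    return weight(x) + rho * np.sum(violation ** 2)

Any of the DE variants compared in the study can then minimize penalized(x) as an ordinary box-constrained objective; feasibility rules or adaptive penalties are common alternatives to this static scheme.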



Citations

Data-Driven Compressive Strength Prediction of Fly Ash Concrete Using Ensemble Learner Algorithms

TL;DR: This work compares ensemble deep neural network models, i.e., the super learner algorithm, simple averaging, weighted averaging, integrated stacking, and separate stacking ensemble models, in order to develop an accurate approach for estimating the compressive strength of fly ash concrete (FAC) and reducing the high variance of the predictive models.

Development of new machine learning model for streamflow prediction: case studies in Pakistan

TL;DR: In this article, a gradient-based optimization (GBO) algorithm is employed to tune the hyperparameters of an adaptive neuro-fuzzy inference system (ANFIS) for accurate estimation of the streamflow of a mountainous river basin.

Wind Power Forecasting with Deep Learning Networks: Time-Series Forecasting

TL;DR: In this article, a temporal convolutional network (TCN), a class of deep learning network (DLN), was employed to capture the correlations between meteorological features and power generation, using a multilayer convolutional architecture trained with gradient-descent algorithms to minimize estimation errors.

Pure Random Orthogonal Search (PROS): A Plain and Elegant Parameterless Algorithm for Global Optimization

TL;DR: The results indicate that the proposed PROS method exhibits very good performance, with fast convergence and short execution times, and can serve as a simple alternative to established and more complex optimizers.

Identification of the effective heat capacity–temperature relationship and the phase change hysteresis in PCMs by means of an inverse heat transfer problem solved with metaheuristic methods

TL;DR: In this paper, two metaheuristic optimisation methods were employed to solve an inverse heat transfer problem involving a phase change material (PCM) in order to identify the relationship c_eff(T) between the effective heat capacity and temperature during melting and solidification of the PCM.
References

Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces

TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented, which requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
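For readers unfamiliar with the method, a minimal DE/rand/1/bin loop in the spirit of this reference is sketched below; it illustrates the few control variables the summary mentions (population size NP, mutation scale F, crossover rate CR). This is a generic sketch, not code from the paper or from the reference.

import numpy as np

rng = np.random.default_rng(0)

def de_rand_1_bin(f, lo, hi, NP=30, F=0.5, CR=0.9, gens=500):
    # Classic DE: NP (population size), F (mutation scale) and CR (crossover rate)
    # are the only control variables.
    dim = lo.size
    pop = lo + rng.random((NP, dim)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(NP):
            idx = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)     # DE/rand/1 mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True               # keep at least one mutant component
            trial = np.where(cross, mutant, pop[i])       # binomial (uniform) crossover
            f_trial = f(trial)
            if f_trial <= fit[i]:                         # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Example call on a simple quadratic; in the paper's setting f would be a
# penalized truss weight such as the one sketched after the abstract.
x_best, f_best = de_rand_1_bin(lambda x: np.sum((x - 1.0) ** 2),
                               np.full(4, -5.0), np.full(4, 5.0))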

Differential Evolution Algorithm With Strategy Adaptation for Global Numerical Optimization

TL;DR: This paper proposes a self-adaptive DE (SaDE) algorithm, in which both the trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions.
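The strategy-adaptation idea can be sketched as simple bookkeeping: each trial-vector strategy in a small pool is selected with a probability that is periodically refreshed from its recent success rate. The snippet below is a simplified illustration of that mechanism, not the full SaDE, which also adapts CR per strategy over a sliding learning period.

import numpy as np

rng = np.random.default_rng(0)

K = 4                              # size of the trial-vector generation strategy pool
probs = np.full(K, 1.0 / K)        # equal selection probabilities at the start
success = np.zeros(K)              # trials that replaced their parent
failure = np.zeros(K)              # trials that were discarded

def pick_strategy():
    # Choose a strategy for the current individual according to the learned probabilities.
    return rng.choice(K, p=probs)

def record(strategy, improved):
    # Book-keep whether the trial produced by this strategy survived selection.
    (success if improved else failure)[strategy] += 1

def refresh_probabilities(eps=0.01):
    # Periodically turn the success rates into new selection probabilities.
    global probs
    rate = success / (success + failure + 1e-12) + eps   # eps keeps every strategy selectable
    probs = rate / rate.sum()
    success[:] = 0.0
    failure[:] = 0.0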

Self-Adapting Control Parameters in Differential Evolution: A Comparative Study on Numerical Benchmark Problems

TL;DR: The results show that the algorithm with self-adaptive control parameter settings is better than, or at least comparable to, the standard DE algorithm and evolutionary algorithms from literature when considering the quality of the solutions obtained.
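The self-adaptation mechanism of this reference (commonly called jDE) can be summarized in a few lines: each individual carries its own F and CR, which are occasionally re-sampled just before a trial vector is produced and are retained only when that trial survives selection. A sketch of the parameter update, using the commonly cited settings tau1 = tau2 = 0.1 and F in [0.1, 1.0], follows; it is an illustration, not the authors' code.

import numpy as np

rng = np.random.default_rng(0)

def jde_parameters(F_i, CR_i, tau1=0.1, tau2=0.1, F_l=0.1, F_u=0.9):
    # With small probabilities tau1 and tau2, re-sample F in [F_l, F_l + F_u] = [0.1, 1.0]
    # and CR in [0, 1]; otherwise keep the individual's current values. The new values are
    # kept for the individual only if its trial vector replaces it in the next generation.
    F_new = F_l + rng.random() * F_u if rng.random() < tau1 else F_i
    CR_new = rng.random() if rng.random() < tau2 else CR_i
    return F_new, CR_new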

JADE: Adaptive Differential Evolution With Optional External Archive

TL;DR: Simulation results show that JADE is better than, or at least comparable to, other classic or adaptive DE algorithms, the canonical particle swarm optimization, and other evolutionary algorithms from the literature in terms of convergence performance for a set of 20 benchmark problems.
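Two ingredients distinguish JADE: the current-to-pbest/1 mutation, which may draw one difference-vector member from an external archive of recently replaced parents, and the adaptation of the parameter means from the F and CR values that produced successful trials. The sketch below illustrates both, assuming NumPy arrays for the population and archive; index-distinctness checks, archive truncation, and the per-individual Cauchy/normal sampling of F and CR are omitted.

import numpy as np

rng = np.random.default_rng(0)

def current_to_pbest_1(pop, fit, i, F, archive, p=0.05):
    # Pull the current vector toward a randomly chosen member of the 100p% best individuals,
    # plus a difference vector whose second member may come from the optional external archive.
    NP = len(pop)
    n_best = max(1, int(p * NP))
    pbest = pop[rng.choice(np.argsort(fit)[:n_best])]
    r1 = pop[rng.integers(NP)]
    union = pop if len(archive) == 0 else np.vstack([pop, archive])
    r2 = union[rng.integers(len(union))]
    return pop[i] + F * (pbest - pop[i]) + F * (r1 - r2)

def update_means(mu_CR, mu_F, S_CR, S_F, c=0.1):
    # Nudge the parameter means toward the successful values of the generation;
    # F uses a Lehmer mean, which favours larger, more explorative values.
    if len(S_CR) > 0:
        mu_CR = (1 - c) * mu_CR + c * float(np.mean(S_CR))
    if len(S_F) > 0:
        S_F = np.asarray(S_F, dtype=float)
        mu_F = (1 - c) * mu_F + c * float(np.sum(S_F ** 2) / np.sum(S_F))
    return mu_CR, mu_F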

Parameter control in evolutionary algorithms

TL;DR: This paper revises the terminology, which is unclear and confusing, provides a classification of such control mechanisms, and surveys various forms of control that have been studied by the evolutionary computation community in recent years.
Trending Questions (1)
How does differential evolution contribute to the design of civil engineering structures?

Differential evolution contributes to the design of civil engineering structures by providing a robust, derivative-free approach for handling difficult constrained optimization problems, such as the constrained optimization of truss structures.