Journal Article

Differential Evolution Training Algorithm for Feed-Forward Neural Networks

TL;DR: In this study, differential evolution has been analyzed as a candidate global optimization method for feed-forward neural networks and does not seem to provide any distinct advantage in terms of learning rate or solution quality.
Abstract
An evolutionary optimization method over continuous search spaces, differential evolution, has recently been successfully applied to real-world and artificial optimization problems and has also been proposed for neural network training. However, differential evolution has not been comprehensively studied in the context of training neural network weights, i.e., how useful differential evolution is in finding the global optimum at the expense of convergence speed. In this study, differential evolution has been analyzed as a candidate global optimization method for feed-forward neural networks. In comparison to gradient-based methods, differential evolution does not seem to provide any distinct advantage in terms of learning rate or solution quality. Differential evolution can rather be used to validate reached optima and to develop regularization terms and non-conventional transfer functions that do not necessarily provide gradient information.
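To make the training setup concrete, the following is a minimal sketch of the classic DE/rand/1/bin scheme optimizing the weights of a one-hidden-layer feed-forward network on a toy regression task. This is an illustration under assumed settings, not the authors' experimental code: the network shape, the population size NP, and the control parameters F and CR are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = sin(x) on [-pi, pi].
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
y = np.sin(X)

n_in, n_hid, n_out = 1, 8, 1
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out  # total weight count

def unpack(w):
    """Split a flat weight vector into layer matrices and biases."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:]
    return W1, b1, W2, b2

def mse(w):
    """Mean squared training error of the network encoded by flat vector w."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)              # hidden layer
    return np.mean((h @ W2 + b2 - y) ** 2)

NP, F, CR = 40, 0.7, 0.9                  # illustrative DE settings
pop = rng.normal(0.0, 1.0, (NP, dim))     # each individual is a weight vector
fit = np.array([mse(p) for p in pop])

for gen in range(500):
    for i in range(NP):
        # DE/rand/1 mutation: three distinct individuals, none equal to i.
        a, b, c = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        # Binomial crossover with one guaranteed mutant component.
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection: keep the trial only if it is no worse.
        f_trial = mse(trial)
        if f_trial <= fit[i]:
            pop[i], fit[i] = trial, f_trial

print("best MSE:", fit.min())

Note that the scheme uses only function evaluations, never derivatives, which is why the abstract can suggest DE for transfer functions that do not provide gradient information.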


Citations
Journal Article

Differential Evolution: A Survey of the State-of-the-Art

TL;DR: This paper presents a detailed review of the basic concepts of DE, a survey of its major variants, its application to multiobjective, constrained, large-scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far.
Journal Article

Differential Evolution Algorithm With Strategy Adaptation for Global Numerical Optimization

TL;DR: This paper proposes a self-adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions.
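A minimal sketch of the strategy-adaptation idea described above, under stated assumptions: the two-strategy pool and the success-rate bookkeeping below are illustrative, and SaDE's actual learning-period mechanics and parameter adaptation are more involved.

import numpy as np

rng = np.random.default_rng(1)
n_strategies = 2
success = np.zeros(n_strategies)
failure = np.zeros(n_strategies)
probs = np.full(n_strategies, 1.0 / n_strategies)

def pick_strategy():
    """Sample a mutation strategy according to the current probabilities."""
    return rng.choice(n_strategies, p=probs)

def record(k, trial_won):
    """Tally whether the trial produced by strategy k replaced its parent."""
    if trial_won:
        success[k] += 1
    else:
        failure[k] += 1

def update_probs(eps=0.01):
    """Re-estimate selection probabilities from per-strategy success rates."""
    global probs
    rates = (success + eps) / (success + failure + 2 * eps)
    probs = rates / rates.sum()
    success[:] = 0
    failure[:] = 0

# Simulated learning period: strategy 1 succeeds more often, so its
# selection probability should grow after the update.
true_win_rate = [0.3, 0.6]   # hypothetical per-strategy success rates
for _ in range(200):
    k = pick_strategy()
    record(k, rng.random() < true_win_rate[k])
update_probs()
print("adapted probabilities:", probs)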
Journal Article

Differential evolution algorithm with ensemble of parameters and mutation strategies

TL;DR: The performance of EPSDE is evaluated on a set of bound-constrained problems and is compared with conventional DE and several state-of-the-art parameter-adaptive DE variants.
Proceedings Article

Self-adaptive differential evolution algorithm for numerical optimization

TL;DR: This paper proposes a novel self-adaptive differential evolution algorithm (SaDE), in which the choice of learning strategy and the values of the two control parameters F and CR are not required to be pre-specified.
Journal Article

Optimizing connection weights in neural networks using the whale optimization algorithm

TL;DR: The qualitative and quantitative results prove that the proposed WOA-based trainer is able to outperform the current algorithms on the majority of datasets in terms of both local optima avoidance and convergence speed.
References
Journal Article

Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces

TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented; it requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
Journal Article

Training feedforward networks with the Marquardt algorithm

TL;DR: The Marquardt algorithm for nonlinear least squares is presented and incorporated into the backpropagation algorithm for training feedforward neural networks; it is found to be much more efficient than either of the other techniques when the network contains no more than a few hundred weights.
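For context, the update rule behind this trainer is the standard Levenberg-Marquardt step (the notation here is generic, not copied from the paper):

\Delta w = -\left( J^{\top} J + \mu I \right)^{-1} J^{\top} e

where J is the Jacobian of the network errors with respect to the weights, e is the error vector, and \mu is an adaptive damping term: small \mu approaches the Gauss-Newton step, large \mu approaches a small gradient-descent step. Since J^{\top} J is an n_w-by-n_w matrix for n_w weights, each step costs roughly O(n_w^3), which is why the method is efficient only while the network contains no more than a few hundred weights.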
Journal Article

Evolving artificial neural networks

TL;DR: It is shown, through an extensive literature review, that combinations of ANNs and EAs can lead to significantly better intelligent systems than relying on ANNs or EAs alone.

Differential Evolution - A simple and efficient adaptive scheme for global optimization over continuous spaces

Kenneth Price
TL;DR: A new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented, and it is demonstrated that the new method converges faster and with more certainty than Adaptive Simulated Annealing as well as the Annealed Nelder & Mead approach.