New Parameter-Free Simplified Swarm Optimization for Artificial Neural Network Training and its Application in the Prediction of Time Series

Publication Type:
Journal Article
Citation:
IEEE Transactions on Neural Networks and Learning Systems, 2013, 24 (4), pp. 661 - 665
Issue Date:
2013
Abstract:
A new soft computing method, the parameter-free simplified swarm optimization (SSO)-based artificial neural network (ANN), or iSSO for short, is proposed to adjust the weights in ANNs. The method is a modification of SSO that seeks to overcome some of its drawbacks. In the experiments, iSSO is compared with five other well-known soft computing methods: the backpropagation algorithm, the genetic algorithm, the particle swarm optimization (PSO) algorithm, cooperative random learning PSO, and SSO. Its performance is tested on five widely used time-series benchmark datasets by adjusting the weights of two ANN models (the multilayer perceptron and the single multiplicative neuron model). The experimental results demonstrate that iSSO is robust and more efficient than the other five algorithms.
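The abstract above does not give the iSSO update rule itself, but the classic SSO scheme it modifies is simple to sketch: each weight dimension of a candidate solution is, with fixed probabilities, copied from the global best, copied from the particle's personal best, or resampled at random. The following is a minimal illustrative sketch (not the paper's parameter-free variant) that uses standard SSO to train a small 1-3-1 MLP for one-step-ahead prediction on a toy sine series; the network size, probabilities `cg` and `cp`, and the toy data are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy time series: one-step-ahead prediction with a tiny 1-3-1 MLP.
series = np.sin(np.linspace(0, 4 * np.pi, 60))
X, y = series[:-1].reshape(-1, 1), series[1:]

def mlp_forward(w, x):
    """Unpack a flat 10-dim weight vector into a 1-3-1 MLP and predict."""
    W1, b1 = w[0:3].reshape(1, 3), w[3:6]
    W2, b2 = w[6:9].reshape(3, 1), w[9]
    h = np.tanh(x @ W1 + b1)          # hidden layer
    return (h @ W2).ravel() + b2      # linear output

def mse(w):
    """Fitness: mean squared one-step-ahead prediction error."""
    return np.mean((mlp_forward(w, X) - y) ** 2)

def sso_train(n_particles=20, n_iters=200, dim=10, cg=0.4, cp=0.7):
    """Classic SSO: per dimension, copy from gbest (prob < cg), from
    pbest (prob < cp), otherwise resample uniformly at random.
    (The 'keep current value' branch is folded into resampling here.)"""
    pos = rng.uniform(-1, 1, (n_particles, dim))
    pbest = pos.copy()
    pbest_fit = np.array([mse(p) for p in pos])
    g = pbest_fit.argmin()
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    for _ in range(n_iters):
        r = rng.random((n_particles, dim))
        new = np.where(r < cg, gbest,                       # global best
              np.where(r < cp, pbest,                       # personal best
                       rng.uniform(-1, 1, (n_particles, dim))))  # random
        fits = np.array([mse(p) for p in new])
        improved = fits < pbest_fit
        pbest[improved], pbest_fit[improved] = new[improved], fits[improved]
        pos = new
        g = pbest_fit.argmin()
        if pbest_fit[g] < gbest_fit:
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest, gbest_fit
```

Because the update is gradient-free, the same loop can train the single multiplicative neuron model mentioned in the abstract by swapping `mlp_forward` for that model's forward pass; the paper's contribution is removing the hand-tuned probabilities (`cg`, `cp`) that this sketch still requires.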