A Clustering-based Differential Evolution Boosted by a Regularisation-based Objective Function and a Local Refinement for Neural Network Training

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
2022 IEEE Congress on Evolutionary Computation (CEC), 2022, pp. 1-8
Issue Date:
2022-09-06
The performance of feed-forward neural networks (FFNN) is directly dependent on the training algorithm. Conventional gradient-based training algorithms are widely used for FFNN training, but they are susceptible to getting stuck in local optima. To overcome this limitation, population-based metaheuristic algorithms such as differential evolution (DE) offer a reliable alternative. In this paper, we propose a novel training algorithm, Reg-IDE, based on an improved DE algorithm. In conventional training, weight regularisation is an approach to reducing the likelihood of over-fitting and enhancing generalisation. However, to the best of our knowledge, current DE-based trainers do not employ regularisation. This paper first proposes a regularisation-based objective function that improves generalisation by adding a new term to the objective. Then, a region-based strategy identifies regions of the search space using a clustering algorithm and updates the population based on the information available in each region. In addition, quasi-opposition-based learning enhances the exploration of the algorithm. The best candidate solution found by the improved DE is then used as the initial network weights for the Levenberg-Marquardt (LM) algorithm, which acts as a local refinement. Experimental results on different benchmarks, in comparison with 26 conventional and population-based approaches, clearly demonstrate the excellent performance of Reg-IDE.
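
The abstract does not give the exact form of the regularisation term, so the sketch below only illustrates the general idea under the common assumption of an L2 (weight-decay) penalty added to a mean-squared-error fitness: each DE candidate is a flat weight vector of a single-hidden-layer FFNN, and its fitness is the prediction error plus the penalty. The names and layout (regularised_objective, hidden_size, lam, the 4-5-1 network in the usage example) are hypothetical, not taken from the paper.

    import numpy as np

    def regularised_objective(weights, X, y, hidden_size, lam=1e-3):
        """Illustrative fitness for one DE candidate: MSE of a single-hidden-layer
        FFNN plus an assumed L2 weight-decay penalty (regularisation term)."""
        n_in = X.shape[1]
        # Unpack the flat candidate vector into layer parameters (assumed layout).
        w1_end = n_in * hidden_size
        W1 = weights[:w1_end].reshape(n_in, hidden_size)
        b1 = weights[w1_end:w1_end + hidden_size]
        w2_start = w1_end + hidden_size
        W2 = weights[w2_start:w2_start + hidden_size].reshape(hidden_size, 1)
        b2 = weights[-1]

        # Forward pass with a sigmoid hidden layer and a linear output.
        hidden = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
        pred = (hidden @ W2 + b2).ravel()

        mse = np.mean((pred - y) ** 2)
        penalty = lam * np.sum(weights ** 2)  # the extra regularisation term
        return mse + penalty

    # Usage example: evaluate a random DE candidate on toy data.
    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(50, 4)), rng.normal(size=50)
    dim = 4 * 5 + 5 + 5 + 1  # parameter count of a 4-5-1 network
    print(regularised_objective(rng.uniform(-1, 1, dim), X, y, hidden_size=5))

In the method described above, such a fitness would score every candidate during the DE iterations, and the best candidate found would then initialise the Levenberg-Marquardt refinement.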