Peer-reviewed article · National production

Optimization of neural networks through grammatical evolution and a genetic algorithm

2016; Elsevier BV; Volume: 56; Language: English

10.1016/j.eswa.2016.03.012

ISSN

1873-6793

Authors

Lídio Mauro Lima de Campos, Oliveira Júnior, Mauro Roisenberg

Topic(s)

Neural Networks and Applications

Abstract

This paper proposes a hybrid neuro-evolutive algorithm (NEA) that uses a compact indirect encoding scheme (IES) to represent its genotypes (a set of ten production rules of a Lindenmayer system with memory). The scheme can reuse genotypes and automatically builds modular, hierarchical, and recurrent neural networks. A genetic algorithm (GA) evolves the Lindenmayer system (L-System), which in turn designs the neural network's architecture. This compact neural codification provides scalability and reduces the search space relative to other methods. Furthermore, the system uses a parallel genome scan engine that increases both the implicit parallelism and the convergence of the GA. The fitness function of the NEA rewards economical artificial neural networks (ANNs) that are easy to implement. The NEA was tested on five real-world classification datasets and three well-known datasets for time series forecasting (TSF). The results are statistically compared against established state-of-the-art algorithms and various forecasting methods (ADANN, ARIMA, UCM, and Forecast Pro). In most cases, our NEA outperformed the other methods, delivering the most accurate classification and time series forecasting with the least computational effort. These superior results are attributed to the improved effectiveness and efficiency of the NEA in the decision-making process. The result is an optimized neural network architecture for solving classification problems and simulating dynamical systems.
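As a rough illustration of the indirect encoding idea described in the abstract, the sketch below rewrites a small set of L-System production rules into a string and decodes that string into a layered topology. The grammar, symbols, and decoder here are hypothetical toy choices, far simpler than the paper's actual ten-rule system with memory; they only show how a compact genotype (the rules) can "grow" a larger phenotype (the network).

```python
# Hypothetical sketch of L-System-based indirect encoding.
# The rules and symbols are illustrative, NOT the paper's grammar.

PRODUCTIONS = {          # genotype: symbol -> rewrite string
    "S": "N[B]N",        # start symbol: two neurons around a branch
    "B": "NB",           # branch grows one neuron per iteration
}

def rewrite(axiom: str, rules: dict, iterations: int) -> str:
    """Apply the production rules in parallel to every symbol."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def decode(phenotype: str) -> list:
    """Toy decoder: '[' opens a new layer, 'N' adds a neuron to it."""
    layers, current = [], 0
    for ch in phenotype:
        if ch == "N":
            current += 1
        elif ch == "[":
            layers.append(current)
            current = 0
    layers.append(current)
    return [n for n in layers if n > 0]

if __name__ == "__main__":
    grown = rewrite("S", PRODUCTIONS, iterations=3)
    print(grown)           # derived string, e.g. "N[NNB]N"
    print(decode(grown))   # layer sizes of the decoded network
```

In a neuro-evolutionary setting of this kind, a GA would mutate and recombine the production rules rather than the network itself, so a small genotypic change can reshape the whole derived architecture, which is the source of the scalability and search-space reduction the abstract claims.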
