Authors: Todorović, Branimir
Stanković, Miomir
Moraga, Claudio
Title: Recurrent neural networks training using derivative free nonlinear Bayesian filters
Journal: Studies in Computational Intelligence
Volume: 620
First page: 383
Last page: 410
Issue Date: 1-Jan-2016
ISSN: 1860-949X
DOI: 10.1007/978-3-319-26393-9_23
URL: https://api.elsevier.com/content/abstract/scopus_id/84949907787
Abstract: 
© Springer International Publishing Switzerland 2016. We have implemented recurrent neural network training algorithms as joint estimation of synaptic weights and neuron outputs using approximate nonlinear recursive Bayesian estimators. We have considered two derivative-free nonlinear estimators, the Divided Difference Filter and the Unscented Kalman Filter, and compared their computational efficiency and performance with the Extended Kalman Filter as training algorithms for different recurrent neural network architectures. The algorithms and architectures were tested on problems of long-term prediction of chaotic time series.
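
The sketch below is a minimal, hypothetical illustration of the idea summarized in the abstract, not the authors' implementation: a small recurrent network whose synaptic weights and neuron outputs form one joint state vector estimated with an Unscented Kalman Filter. The network size, noise covariances, and the simple sine-wave series are assumptions made for the example; the paper uses chaotic time series and also considers the Divided Difference Filter and the Extended Kalman Filter.

```python
# Illustrative sketch (assumed details, not the paper's code): joint
# weight/state estimation of a tiny RNN with an Unscented Kalman Filter.
import numpy as np

rng = np.random.default_rng(0)

n_h = 4                       # hidden neurons (assumed size)
n_w = n_h * n_h + n_h + n_h   # recurrent + input + output weights
n_z = n_w + n_h               # joint state: weights followed by hidden outputs

def unpack(z):
    W_r = z[:n_h * n_h].reshape(n_h, n_h)
    w_in = z[n_h * n_h:n_h * n_h + n_h]
    w_out = z[n_h * n_h + n_h:n_w]
    h = z[n_w:]
    return W_r, w_in, w_out, h

def f(z, x):
    """State transition: weights follow a random walk, hidden outputs
    follow the recurrent network dynamics."""
    W_r, w_in, _, h = unpack(z)
    h_new = np.tanh(W_r @ h + w_in * x)
    return np.concatenate([z[:n_w], h_new])

def g(z):
    """Measurement: network output is a linear readout of the hidden state."""
    _, _, w_out, h = unpack(z)
    return np.array([w_out @ h])

def sigma_points(m, P, kappa=1.0):
    n = m.size
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [m] + [m + S[:, i] for i in range(n)] + [m - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def ukf_step(m, P, x, y, Q, R):
    # Time update: propagate sigma points through the joint dynamics.
    X, w = sigma_points(m, P)
    Xf = np.array([f(z, x) for z in X])
    m_pred = w @ Xf
    P_pred = Q + sum(wi * np.outer(d, d) for wi, d in zip(w, Xf - m_pred))
    # Measurement update using the predicted network output.
    X, w = sigma_points(m_pred, P_pred)
    Y = np.array([g(z) for z in X])
    y_pred = w @ Y
    P_yy = R + sum(wi * np.outer(d, d) for wi, d in zip(w, Y - y_pred))
    P_zy = sum(wi * np.outer(dz, dy)
               for wi, dz, dy in zip(w, X - m_pred, Y - y_pred))
    K = P_zy @ np.linalg.inv(P_yy)
    m_new = m_pred + K @ (y - y_pred)
    P_new = P_pred - K @ P_yy @ K.T
    # Keep the covariance symmetric and positive definite numerically.
    P_new = 0.5 * (P_new + P_new.T) + 1e-9 * np.eye(m.size)
    return m_new, P_new

# One-step-ahead prediction of a noisy sine wave, a stand-in here for the
# chaotic series used in the paper.
series = np.sin(0.3 * np.arange(300)) + 0.01 * rng.standard_normal(300)
m = 0.1 * rng.standard_normal(n_z)
P = 0.1 * np.eye(n_z)
Q = 1e-5 * np.eye(n_z)
R = 1e-2 * np.eye(1)

for t in range(len(series) - 1):
    m, P = ukf_step(m, P, series[t], series[t + 1:t + 2], Q, R)

print("final hidden outputs:", np.round(m[n_w:], 3))
```

Because the UKF propagates sigma points through the network equations directly, no derivatives of the network with respect to the weights are required, which is the derivative-free property the title refers to; the Extended Kalman Filter variant would instead linearize the same dynamics with Jacobians.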
