DC Field: Value (Language)
dc.contributor.author: Matijević, Luka (en_US)
dc.contributor.author: Čorić, Rebeka (en_US)
dc.contributor.author: Đumić, Mateja (en_US)
dc.description.abstract: In recent years, machine learning, and in particular neural networks (NNs), have received much attention due to their numerous real-world applications. NN training is an essential step in building a model that can make reliable predictions from given data. The training process aims to find the optimal values of the network's internal parameters so that the network performs well on test data according to a given metric. The most common way to train an NN is to use an optimizer based on gradient descent (GD), which updates the parameters at each epoch based on the given data. In this paper, we are interested in using metaheuristics to guide the entire training process. The main idea is to identify promising regions of the search space and invoke a GD-based optimizer in these regions as a local search procedure. For this purpose, we applied metaheuristics such as Variable Neighborhood Search and a memetic algorithm to the NN training process and measured their performance on publicly available classification datasets, using classification accuracy as the evaluation metric. (en_US)
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.title: Metaheuristic Approaches to neural networks training (en_US)
dc.type: Conference Paper (en_US)
dc.relation.conference: KOI 2022, Šibenik, Croatia, September 28-30, 2022 (en_US)
dc.relation.publication: Book of Abstracts (en_US)
dc.contributor.affiliation: Computer Science (en_US)
dc.contributor.affiliation: Mathematical Institute of the Serbian Academy of Sciences and Arts (en_US)
item.openairetype: Conference Paper
item.fulltext: No Fulltext
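The idea described in the abstract — letting a metaheuristic explore the parameter space and invoking a GD-based optimizer in promising regions as a local search — can be sketched as follows. This is a minimal illustrative sketch under our own assumptions (a tiny logistic model on synthetic data, a basic Variable Neighborhood Search with Gaussian shaking, plain gradient descent as the local search), not the authors' actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, linearly separable binary classification data (assumption:
# the paper uses real public datasets; this stands in for illustration).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def forward(w, b, X):
    """Sigmoid output of a one-neuron 'network'."""
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

def loss(w, b):
    """Binary cross-entropy on the training data."""
    p = forward(w, b, X)
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

def gd_local_search(w, b, steps=50, lr=0.5):
    """Plain gradient descent: the GD-based local search procedure."""
    for _ in range(steps):
        p = forward(w, b, X)
        g = p - y                       # gradient of BCE w.r.t. pre-activation
        w = w - lr * (X.T @ g) / len(y)
        b = b - lr * g.mean()
    return w, b

def vns(k_max=3, iters=20):
    """Basic VNS: shake in growing neighborhoods, refine with GD."""
    w, b = rng.normal(size=2), 0.0
    w, b = gd_local_search(w, b)
    best = loss(w, b)
    for _ in range(iters):
        k = 1
        while k <= k_max:
            # Shake: perturb the incumbent within neighborhood k.
            w2 = w + rng.normal(scale=0.5 * k, size=2)
            b2 = b + rng.normal(scale=0.5 * k)
            w2, b2 = gd_local_search(w2, b2)
            cand = loss(w2, b2)
            if cand < best:             # improvement: move, restart at k = 1
                w, b, best = w2, b2, cand
                k = 1
            else:                       # no improvement: widen the neighborhood
                k += 1
    return w, b, best

w, b, final_loss = vns()
acc = np.mean((forward(w, b, X) > 0.5) == y)
print(f"loss={final_loss:.4f}, accuracy={acc:.3f}")
```

A memetic algorithm would follow the same pattern, but maintain a population of parameter vectors and apply the GD local search to offspring produced by crossover and mutation.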