DC Field | Value | Language
dc.contributor.author | Janković, Radmila | en_US
dc.contributor.author | Amelio, Alessia | en_US
dc.contributor.author | Draganov, Ivo R. | en_US
dc.date.accessioned | 2022-05-09T12:20:01Z | -
dc.date.available | 2022-05-09T12:20:01Z | -
dc.date.issued | 2022-01-01 | -
dc.identifier.isbn | 9781665437783 | -
dc.identifier.uri | http://researchrepository.mi.sanu.ac.rs/handle/123456789/4800 | -
dc.description.abstract | Handwriting recognition is a challenging task, and with advancements in deep learning it can be performed even on very limited document collections. This paper performs writer identification and retrieval from historical documents using an ensemble of convolutional neural network models built on the pre-trained Inception-ResNet-v2 architecture. The dataset comprises 170 images grouped into 34 classes. The results show that the ensemble model outperforms the single pre-trained models, achieving an accuracy of 96%. (A minimal sketch of this ensemble approach follows the record below.) | en_US
dc.publisher | IEEE | en_US
dc.subject | cultural heritage; deep learning; ensemble learning; historical documents; writer identification | en_US
dc.title | Writer Identification from Historical Documents Using Ensemble Deep Learning Transfer Models | en_US
dc.type | Conference Paper | en_US
dc.relation.conference | 21st International Symposium INFOTEH-JAHORINA, INFOTEH 2022 | en_US
dc.identifier.doi | 10.1109/INFOTEH53737.2022.9751301 | -
dc.identifier.scopus | 2-s2.0-85128708629 | -
dc.contributor.affiliation | Computer Science | en_US
dc.contributor.affiliation | Mathematical Institute of the Serbian Academy of Sciences and Arts | en_US
dc.relation.firstpage | 1 | -
dc.relation.lastpage | 5 | -
dc.description.rank | M33 | -
item.cerifentitytype | Publications | -
item.openairetype | Conference Paper | -
item.grantfulltext | none | -
item.fulltext | No Fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
crisitem.author.orcid | 0000-0003-3424-134X | -
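The abstract describes the method only at a high level, so the Python sketch below illustrates the general idea of an ensemble of Inception-ResNet-v2 transfer models for 34-class writer identification. Only the backbone architecture, the class count, and the use of an ensemble come from the abstract; the ensemble size (three members), the frozen-backbone head layout, and the soft-voting combination are illustrative assumptions, not the paper's exact configuration.

# Minimal sketch (not the paper's exact setup): an ensemble of
# Inception-ResNet-v2 transfer models for 34-class writer identification.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import InceptionResNetV2

NUM_CLASSES = 34             # 34 writer classes, per the abstract
IMAGE_SHAPE = (299, 299, 3)  # default Inception-ResNet-v2 input size

def build_member(seed: int) -> tf.keras.Model:
    """One ensemble member: frozen ImageNet backbone + new softmax head."""
    base = InceptionResNetV2(include_top=False, weights="imagenet",
                             input_shape=IMAGE_SHAPE, pooling="avg")
    base.trainable = False   # transfer learning: reuse ImageNet features
    head = tf.keras.layers.Dense(
        NUM_CLASSES, activation="softmax",
        kernel_initializer=tf.keras.initializers.GlorotUniform(seed=seed))
    model = tf.keras.Model(base.input, head(base.output))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical ensemble of three members that differ only in the random
# initialization of their classification heads.
members = [build_member(seed=s) for s in (0, 1, 2)]

def ensemble_predict(images: np.ndarray) -> np.ndarray:
    """Soft voting: average the members' softmax outputs, then argmax."""
    probs = np.mean([m.predict(images, verbose=0) for m in members], axis=0)
    return np.argmax(probs, axis=1)

Averaging softmax outputs (soft voting) is a standard way to combine such members; the paper reports that its ensemble reaches 96% accuracy, outperforming each single pre-trained model.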

SCOPUS™ Citations: 2 (checked on Dec 20, 2024)
