Show simple item record

Article

DC Field | Value | Language
dc.creator | González Díaz, Rocío | es
dc.creator | Gutiérrez Naranjo, Miguel Ángel | es
dc.creator | Paluzo Hidalgo, Eduardo | es
dc.date.accessioned | 2022-07-01T11:15:28Z |
dc.date.available | 2022-07-01T11:15:28Z |
dc.date.issued | 2022 |
dc.identifier.citation | González Díaz, R., Gutiérrez Naranjo, M.Á. and Paluzo Hidalgo, E. (2022). Topology-based representative datasets to reduce neural network training resources. Neural Computing and Applications, May 2022 |
dc.identifier.issn | 1433-3058 | es
dc.identifier.uri | https://hdl.handle.net/11441/134919 |
dc.description.abstract | One of the main drawbacks of the practical use of neural networks is the long time required in the training process. Such a training process consists of an iterative change of parameters trying to minimize a loss function. These changes are driven by a dataset, which can be seen as a set of labeled points in an n-dimensional space. In this paper, we explore the concept of a representative dataset, which is a dataset smaller than the original one, satisfying a nearness condition independent of isometric transformations. Representativeness is measured using persistence diagrams (a computational topology tool) due to their computational efficiency. We theoretically prove that the accuracy of a perceptron evaluated on the original dataset coincides with the accuracy of the neural network evaluated on the representative dataset when the neural network architecture is a perceptron, the loss function is the mean squared error, and certain conditions on the representativeness of the dataset are imposed. These theoretical results, accompanied by experimentation, open the door to reducing the size of the dataset in order to save time in the training process of any neural network. | es
dc.description.sponsorship | Agencia Estatal de Investigación PID2019-107339GB-100 | es
dc.description.sponsorship | Agencia Andaluza del Conocimiento P20-01145 | es
dc.format | application/pdf | es
dc.format.extent | 17 | es
dc.language.iso | eng | es
dc.publisher | Springer | es
dc.relation.ispartof | Neural Computing and Applications, May 2022 |
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International | *
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | *
dc.subject | Data reduction | es
dc.subject | Neural networks | es
dc.subject | Representative datasets | es
dc.subject | Computational topology | es
dc.title | Topology-based representative datasets to reduce neural network training resources | es
dc.type | info:eu-repo/semantics/article | es
dc.type.version | info:eu-repo/semantics/publishedVersion | es
dc.rights.accessRights | info:eu-repo/semantics/openAccess | es
dc.contributor.affiliation | Universidad de Sevilla. Departamento de Matemática Aplicada I (ETSII) | es
dc.contributor.affiliation | Universidad de Sevilla. Departamento de Ciencias de la Computación e Inteligencia Artificial | es
dc.relation.projectID | PID2019-107339GB-100 | es
dc.relation.projectID | P20-01145 | es
dc.relation.publisherversion | https://link.springer.com/article/10.1007/s00521-022-07252-y | es
dc.identifier.doi | 10.1007/s00521-022-07252-y | es
dc.contributor.group | Universidad de Sevilla. TIC193: Computación Natural | es
dc.contributor.group | Universidad de Sevilla. FQM-369: Combinatorial Image Analysis | es
dc.journaltitle | Neural Computing and Applications | es
dc.publication.issue | May 2022 | es
dc.contributor.funder | Agencia Estatal de Investigación. España | es
dc.contributor.funder | Agencia Andaluza del Conocimiento | es
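The abstract describes selecting a "representative dataset": a smaller labeled subset that stays near the original data. As a minimal illustration only, the sketch below uses a greedy ε-cover to pick a same-label representative within distance ε of every original point. This is a simplified stand-in, not the paper's algorithm: the paper measures representativeness with persistence diagrams, which are not computed here, and the function name and toy data are hypothetical.

```python
import math
import random

def epsilon_representative(points, labels, eps):
    """Greedily pick indices of a subset S such that every original point
    lies within eps of a same-labelled point of S. A simplified nearness
    condition; the paper's measure (persistence diagrams) is not used here."""
    selected = []  # indices of chosen representatives
    for i, (p, y) in enumerate(zip(points, labels)):
        covered = any(
            labels[j] == y and math.dist(p, points[j]) <= eps
            for j in selected
        )
        if not covered:
            selected.append(i)  # point becomes its own representative
    return selected

# Toy example: two well-separated blobs with opposite labels.
random.seed(0)
points = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(100)]
points += [(random.gauss(3, 0.3), random.gauss(3, 0.3)) for _ in range(100)]
labels = [0] * 100 + [1] * 100

reps = epsilon_representative(points, labels, eps=0.5)
print(len(points), "->", len(reps))  # the subset is typically much smaller
```

Training on the reduced subset instead of the full dataset is what yields the time savings the abstract refers to; the paper's theoretical guarantee (for a perceptron under mean squared error) requires its stricter, topology-based representativeness conditions rather than this plain ε-cover.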

Files | Size | Format | View | Description
Gonzalez-Diaz2022_Article_Topo ... | 2.291Mb | PDF | View/Open |

This item appears in the following collection(s)


Attribution-NonCommercial-NoDerivatives 4.0 International
Except where otherwise noted, this item's license is described as: Attribution-NonCommercial-NoDerivatives 4.0 International