Authors: González Díaz, Rocío; Gutiérrez Naranjo, Miguel Ángel; Paluzo Hidalgo, Eduardo
Date: 2022-07-01
Citation: González Díaz, R., Gutiérrez Naranjo, M.Á. and Paluzo Hidalgo, E. (2022). Topology-based representative datasets to reduce neural network training resources. Neural Computing and Applications, May 2022.
ISSN: 1433-3058
URI: https://hdl.handle.net/11441/134919
Abstract: One of the main drawbacks of the practical use of neural networks is the long time required by the training process. This process consists of an iterative change of parameters that tries to minimize a loss function. The changes are driven by a dataset, which can be seen as a set of labeled points in an n-dimensional space. In this paper, we explore the concept of a representative dataset: a dataset smaller than the original one that satisfies a nearness condition independent of isometric transformations. Representativeness is measured using persistence diagrams (a computational topology tool) because of their computational efficiency. We prove that the accuracy of a neural network evaluated on the original dataset coincides with its accuracy on the representative dataset when the architecture is a perceptron, the loss function is the mean squared error, and certain conditions on the representativeness of the dataset are imposed. These theoretical results, accompanied by experimentation, open a door to reducing the size of the dataset in order to save time in the training process of any neural network.
Format: application/pdf, 17 pages
Language: English
License: Attribution-NonCommercial-NoDerivatives 4.0 International (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Keywords: Data reduction; Neural networks; Representative datasets; Computational topology
Title: Topology-based representative datasets to reduce neural network training resources
Type: Article
Access: Open access
DOI: https://doi.org/10.1007/s00521-022-07252-y
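The abstract describes measuring how well a smaller dataset represents the original one by comparing persistence diagrams. As a loose illustration only (not the paper's actual construction, which relies on the full persistence-diagram machinery and a formal representativeness condition), the sketch below computes the 0-dimensional persistence death times of a point cloud's Vietoris-Rips filtration, which are exactly the edge lengths of its Euclidean minimum spanning tree, and compares sorted death lists as a crude dissimilarity proxy. All function names here are hypothetical, not taken from the paper.

```python
import math

def zero_dim_deaths(points):
    """Death times of the 0-dimensional persistence classes of the
    Vietoris-Rips filtration of a point cloud. All classes are born
    at 0, and the death times are the edge lengths of a Euclidean
    minimum spanning tree, computed here with Prim's algorithm."""
    n = len(points)
    in_tree = [False] * n
    in_tree[0] = True
    # best[i] = distance from point i to the closest tree vertex so far.
    best = [math.dist(points[0], p) for p in points]
    deaths = []
    for _ in range(n - 1):
        j = min((i for i in range(n) if not in_tree[i]),
                key=lambda i: best[i])
        deaths.append(best[j])
        in_tree[j] = True
        for i in range(n):
            if not in_tree[i]:
                best[i] = min(best[i], math.dist(points[j], points[i]))
    return sorted(deaths)

def diagram_gap(deaths_a, deaths_b):
    """Crude dissimilarity proxy between two 0-dimensional diagrams:
    pad the shorter sorted death list with zeros (points that die
    immediately, i.e. lie on the diagonal) and take the largest
    absolute difference. This is NOT the bottleneck distance used
    in the paper, only a cheap stand-in for illustration."""
    m = max(len(deaths_a), len(deaths_b))
    a = [0.0] * (m - len(deaths_a)) + list(deaths_a)
    b = [0.0] * (m - len(deaths_b)) + list(deaths_b)
    return max((abs(x - y) for x, y in zip(a, b)), default=0.0)

# Example: the unit square versus a two-point subsample of it.
square = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
subset = [(0.0, 0.0), (1.0, 1.0)]
gap = diagram_gap(zero_dim_deaths(square), zero_dim_deaths(subset))
```

A small gap suggests the subsample preserves the connectivity scale of the original cloud; the paper's representativeness condition plays an analogous but mathematically precise role.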