Scientific production repository of the Universidad de Sevilla

An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data

Open Access

Author: Stromatias, Evangelos
Soto, Miguel
Serrano Gotarredona, María Teresa
Linares Barranco, Bernabé
Date: 2017
Published in: Frontiers in Neuroscience, 11 (article 350), 1-17.
Document type: Article
Abstract: This paper introduces a novel methodology for training an event-driven classifier within a Spiking Neural Network (SNN) System capable of yielding good classification results when using both synthetic input data and real data captured from Dynamic Vision Sensor (DVS) chips. The proposed supervised method uses the spiking activity provided by an arbitrary topology of prior SNN layers to build histograms and train the classifier in the frame domain using the stochastic gradient descent algorithm. In addition, this approach can cope with leaky integrate-and-fire neuron models within the SNN, a desirable feature for real-world SNN applications, where neural activation must fade away after some time in the absence of inputs. Consequently, this way of building histograms captures the dynamics of spikes immediately before the classifier. We tested our method on the MNIST data set using different synthetic encodings and real DVS sensory data sets such as N-MNIST, MNIST-DVS, and Poker-DVS usin...
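The abstract describes collecting the output spikes of the preceding SNN layers into per-neuron histograms and then training the final classifier on those histogram "frames" with stochastic gradient descent. The sketch below illustrates that idea only; it is not the authors' code, and all names (spike_histogram, make_fake_events, n_neurons, t_window, the learning rate, and the toy event streams) are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: build spike-count histograms from the events emitted
# by a prior SNN layer, then train a softmax classifier on them with SGD.

rng = np.random.default_rng(0)

n_neurons = 64        # size of the last hidden SNN layer (assumed)
n_classes = 10        # e.g. MNIST digits
t_window = 100e-3     # observation window in seconds (assumed)


def spike_histogram(events, n_neurons, t_start, t_stop):
    """Count spikes per neuron inside [t_start, t_stop).

    `events` is an array of (neuron_id, timestamp) pairs, i.e. the spiking
    activity delivered by the preceding SNN layers.
    """
    neuron_ids, timestamps = events[:, 0].astype(int), events[:, 1]
    mask = (timestamps >= t_start) & (timestamps < t_stop)
    return np.bincount(neuron_ids[mask], minlength=n_neurons).astype(float)


def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def make_fake_events(label):
    """Toy stand-in for SNN/DVS output: Poisson events whose rates depend on the label."""
    rates = 5.0 + 20.0 * (np.arange(n_neurons) % n_classes == label)
    counts = rng.poisson(rates * t_window)
    neuron_ids = np.repeat(np.arange(n_neurons), counts)
    timestamps = rng.uniform(0.0, t_window, size=neuron_ids.size)
    return np.stack([neuron_ids, timestamps], axis=1)


labels = rng.integers(0, n_classes, size=200)
X = np.stack([spike_histogram(make_fake_events(y), n_neurons, 0.0, t_window)
              for y in labels])
X /= X.sum(axis=1, keepdims=True) + 1e-9   # normalise each histogram

# SGD training of a linear softmax classifier on the histogram "frames"
W = np.zeros((n_neurons, n_classes))
b = np.zeros(n_classes)
lr = 0.5

for epoch in range(20):
    for i in rng.permutation(len(X)):
        p = softmax(X[i] @ W + b)
        grad = p.copy()
        grad[labels[i]] -= 1.0          # dL/dz for cross-entropy loss
        W -= lr * np.outer(X[i], grad)
        b -= lr * grad

pred = np.argmax(softmax(X @ W + b), axis=1)
print("training accuracy:", (pred == labels).mean())
```

In the paper's setting the histograms come from real or synthetic spike streams (N-MNIST, MNIST-DVS, Poker-DVS) rather than the toy generator above, and the leaky integrate-and-fire dynamics of the SNN determine which spikes survive to be counted within the window.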
Citation: Stromatias, E., Soto, M., Serrano Gotarredona, M.T. and Linares Barranco, B. (2017). An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data. Frontiers in Neuroscience, 11 (article 350), 1-17.
Size: 4.858 MB
Format: PDF

URI: http://hdl.handle.net/11441/64029

DOI: 10.3389/fnins.2017.00350




This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License
