Show simple item record

Article

dc.creator: Linares Barranco, Alejandro [es]
dc.creator: Liu, Hongjie [es]
dc.creator: Ríos Navarro, José Antonio [es]
dc.creator: Gómez Rodríguez, Francisco de Asís [es]
dc.creator: Moeys, Diederick P. [es]
dc.creator: Delbruck, Tobi [es]
dc.date.accessioned: 2018-07-20T09:48:47Z
dc.date.available: 2018-07-20T09:48:47Z
dc.date.issued: 2018
dc.identifier.citation: Linares Barranco, A., Liu, H., Rios Navarro, A., Gómez Rodríguez, F.d.A., Moeys, D.P. y Delbruck, T. (2018). Approaching Retinal Ganglion Cell Modeling and FPGA Implementation for Robotics. Entropy, 20 (6)
dc.identifier.issn: 1099-4300 [es]
dc.identifier.uri: https://hdl.handle.net/11441/77484
dc.description.abstract: Taking inspiration from biology to solve engineering problems using the organizing principles of biological neural computation is the aim of the field of neuromorphic engineering. This field has demonstrated success in sensor based applications (vision and audition) as well as in cognition and actuators. This paper is focused on mimicking the approaching detection functionality of the retina that is computed by one type of Retinal Ganglion Cell (RGC) and its application to robotics. These RGCs transmit action potentials when an expanding object is detected. In this work we compare the software and hardware logic FPGA implementations of this approaching function and the hardware latency when applied to robots, as an attention/reaction mechanism. The visual input for these cells comes from an asynchronous event-driven Dynamic Vision Sensor, which leads to an end-to-end event based processing system. The software model has been developed in Java, and computed with an average processing time per event of 370 ns on a NUC embedded computer. The output firing rate for an approaching object depends on the cell parameters that represent the needed number of input events to reach the firing threshold. For the hardware implementation, on a Spartan 6 FPGA, the processing time is reduced to 160 ns/event with the clock running at 50 MHz. The entropy has been calculated to demonstrate that the system is not totally deterministic in response to approaching objects because of several bioinspired characteristics. It has been measured that a Summit XL mobile robot can react to an approaching object in 90 ms, which can be used as an attentional mechanism. This is faster than similar event-based approaches in robotics and equivalent to human reaction latencies to visual stimulus. [es]
dc.description.sponsorship: Ministerio de Economía y Competitividad TEC2016-77785-P [es]
dc.description.sponsorship: Comisión Europea FP7-ICT-600954 [es]
dc.format: application/pdf [es]
dc.language.iso: eng [es]
dc.publisher: MDPI [es]
dc.relation.ispartof: Entropy, 20 (6)
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 Internacional
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Neuromorphic engineering [es]
dc.subject: Event-based processing [es]
dc.subject: Address-Event-Representation [es]
dc.subject: Dynamic Vision Sensor [es]
dc.subject: Approach sensitivity cell [es]
dc.subject: Retina Ganglion Cell [es]
dc.subject: Robotic [es]
dc.subject: FPGA [es]
dc.title: Approaching Retinal Ganglion Cell Modeling and FPGA Implementation for Robotics [es]
dc.type: info:eu-repo/semantics/article [es]
dcterms.identifier: https://ror.org/03yxnpp24
dc.type.version: info:eu-repo/semantics/publishedVersion [es]
dc.rights.accessRights: info:eu-repo/semantics/openAccess [es]
dc.contributor.affiliation: Universidad de Sevilla. Departamento de Arquitectura y Tecnología de Computadores [es]
dc.relation.projectID: TEC2016-77785-P [es]
dc.relation.projectID: FP7-ICT-600954 [es]
dc.relation.publisherversion: http://www.mdpi.com/1099-4300/20/6/475 [es]
dc.identifier.doi: 10.3390/e20060475 [es]
idus.format.extent: 13 [es]
dc.journaltitle: Entropy [es]
dc.publication.volumen: 20 [es]
dc.publication.issue: 6 [es]
dc.contributor.funder: Ministerio de Economía y Competitividad (MINECO). España
dc.contributor.funder: European Union (UE). FP7
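The abstract above describes an accumulate-to-threshold mechanism: asynchronous DVS address-events drive the approach-sensitivity cell, and an output spike is emitted once the number of contributing input events reaches a configurable firing threshold. The Java listing below is a minimal sketch of that idea only, written from the abstract's description; the class, method and parameter names (ApproachCellSketch, onEvent, threshold, leakPeriodUs) are hypothetical, and this is not the authors' published Java model, which also handles the full cell structure and reaches about 370 ns per event.

// Illustrative sketch of an event-driven approach-sensitivity cell, loosely
// following the mechanism summarized in the abstract: DVS events are
// integrated and an output spike is emitted once a configurable number of
// contributing events (the firing threshold) has been accumulated.
// All names are hypothetical; this is NOT the authors' Java model.
public class ApproachCellSketch {

    /** A single DVS address-event: pixel coordinates, polarity and timestamp (us). */
    record DvsEvent(int x, int y, boolean on, long timestampUs) {}

    private final int threshold;      // events needed to reach the firing threshold
    private final long leakPeriodUs;  // assumed leak: decay one unit per period
    private int membrane = 0;         // integrated activity ("membrane potential")
    private long lastLeakUs = 0;

    ApproachCellSketch(int threshold, long leakPeriodUs) {
        this.threshold = threshold;
        this.leakPeriodUs = leakPeriodUs;
    }

    /** Process one event; returns true when the cell fires (approach detected). */
    boolean onEvent(DvsEvent e) {
        // Leak the integrated activity so that only a sustained burst of events,
        // as produced by an expanding (approaching) object, reaches threshold.
        long elapsed = e.timestampUs() - lastLeakUs;
        if (elapsed > leakPeriodUs) {
            membrane = Math.max(0, membrane - (int) (elapsed / leakPeriodUs));
            lastLeakUs = e.timestampUs();
        }
        // Simplifying assumption: only OFF events (darkening edges of an
        // approaching dark object) excite the cell; ON events are ignored.
        if (!e.on()) {
            membrane++;
        }
        if (membrane >= threshold) {
            membrane = 0;             // reset after firing
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // Toy event stream: a growing ring of OFF events stands in for an
        // object that expands on the sensor as it approaches.
        ApproachCellSketch cell = new ApproachCellSketch(200, 1_000);
        long t = 0;
        int spikes = 0;
        for (int radius = 1; radius <= 30; radius++) {
            for (int k = 0; k < 8 * radius; k++, t += 50) {
                if (cell.onEvent(new DvsEvent(64 + radius, 64, false, t))) {
                    spikes++;
                }
            }
        }
        System.out.println("Output spikes for the toy approach stimulus: " + spikes);
    }
}

In a real system the events would come from the DVS stream itself, and the threshold parameter would be tuned as the paper describes, since it sets the output firing rate for a given approaching stimulus.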

Files:
entropy-20-00475.pdf (10.21 MB, PDF)


Attribution-NonCommercial-NoDerivatives 4.0 International
Except where otherwise noted, this item's license is described as: Attribution-NonCommercial-NoDerivatives 4.0 International