Show simple item record

Conference paper

dc.creator: Camuñas Mesa, Luis Alejandro [es]
dc.creator: Pérez Carrasco, José Antonio [es]
dc.creator: Zamarreño Ramos, Carlos [es]
dc.creator: Serrano Gotarredona, María Teresa [es]
dc.creator: Linares Barranco, Bernabé [es]
dc.date.accessioned: 2020-10-22T10:05:58Z
dc.date.available: 2020-10-22T10:05:58Z
dc.date.issued: 2010
dc.identifier.citation: Camuñas Mesa, L.A., Pérez Carrasco, J.A., Zamarreño Ramos, C., Serrano Gotarredona, M.T. and Linares Barranco, B. (2010). Neocortical frame-free vision sensing and processing through scalable Spiking ConvNet hardware. In IJCNN 2010: International Joint Conference on Neural Networks. Barcelona, España: IEEE Computer Society.
dc.identifier.isbn: 978-1-4244-6916-1 [es]
dc.identifier.issn: 2161-4393 [es]
dc.identifier.uri: https://hdl.handle.net/11441/102143
dc.description.abstract: This paper summarizes how Convolutional Neural Networks (ConvNets) can be implemented in hardware using Spiking neural network Address-Event-Representation (AER) technology, for sophisticated pattern and object recognition tasks operating at millisecond delay throughputs. Although such hardware would require hundreds of individual convolutional modules and thus is presently not yet available, we discuss methods and technologies for implementing it in the near future. On the other hand, we provide precise behavioral simulations of large-scale spiking AER convolutional hardware and evaluate its performance, by using performance figures of already available AER convolution chips fed with real sensory data obtained from physically available AER motion retina chips. We provide simulation results of systems trained for people recognition, showing recognition delays of a few milliseconds from stimulus onset. ConvNets show good up-scaling behavior and possibilities for being implemented efficiently with new nanoscale hybrid CMOS/non-CMOS technologies. [es]
dc.description.sponsorship: European Union 216777 (NABAB) [es]
dc.description.sponsorship: Ministerio de Educación y Ciencia TEC2006-11730-C03-01 [es]
dc.description.sponsorship: Ministerio de Economía y Competitividad TEC2009-10639-C04-01 [es]
dc.description.sponsorship: Junta de Andalucía P06-TIC-01417 [es]
dc.format: application/pdf [es]
dc.format.extent: 8 [es]
dc.language.iso: eng [es]
dc.publisher: IEEE Computer Society [es]
dc.relation.ispartof: IJCNN 2010 : International Joint Conference on Neural Networks (2010)
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 Internacional [*]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/ [*]
dc.title: Neocortical frame-free vision sensing and processing through scalable Spiking ConvNet hardware [es]
dc.type: info:eu-repo/semantics/conferenceObject [es]
dcterms.identifier: https://ror.org/03yxnpp24
dc.type.version: info:eu-repo/semantics/submittedVersion [es]
dc.rights.accessRights: info:eu-repo/semantics/openAccess [es]
dc.contributor.affiliation: Universidad de Sevilla. Departamento de Arquitectura y Tecnología de Computadores [es]
dc.contributor.affiliation: Universidad de Sevilla. Departamento de Electrónica y Electromagnetismo [es]
dc.contributor.affiliation: Universidad de Sevilla. Departamento de Teoría de la Señal y Comunicaciones
dc.relation.projectID: 216777 (NABAB) [es]
dc.relation.projectID: TEC2006-11730-C03-01 [es]
dc.relation.projectID: TEC2009-10639-C04-01 [es]
dc.relation.projectID: P06-TIC-01417 [es]
dc.relation.publisherversion: https://ieeexplore.ieee.org/document/5596366 [es]
dc.identifier.doi: 10.1109/IJCNN.2010.5596366 [es]
dc.eventtitle: IJCNN 2010 : International Joint Conference on Neural Networks [es]
dc.eventinstitution: Barcelona, España [es]
dc.relation.publicationplace: New York, USA [es]
dc.contributor.funder: European Union (UE) [es]
dc.contributor.funder: Ministerio de Educación y Ciencia (MEC). España [es]
dc.contributor.funder: Ministerio de Economía y Competitividad (MINECO). España [es]
dc.contributor.funder: Junta de Andalucía [es]

Files:
Neocortical frame-free vision ... (PDF, 1.528 MB)

