Show simple item record

Article

dc.creator: Cichocki, Andrzej
dc.creator: Cruces Álvarez, Sergio Antonio
dc.creator: Amari, Shun-ichi
dc.date.accessioned: 2015-12-10T13:07:56Z
dc.date.available: 2015-12-10T13:07:56Z
dc.date.issued: 2011
dc.identifier.issn: 1099-4300
dc.identifier.uri: http://hdl.handle.net/11441/31765
dc.description.abstract: We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences, referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta- and Gamma-divergences. By adjusting these tuning parameters, we show that a wide range of standard and new divergences can be obtained. The corresponding learning algorithms for NMF are shown to integrate and generalize many existing ones, including Lee-Seung, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), Alpha-NMF, and Beta-NMF. Owing to more degrees of freedom in tuning the parameters, the proposed family of AB-multiplicative NMF algorithms is shown to improve robustness with respect to noise and outliers. The analysis illuminates the links between the AB-divergence and other divergences, especially the Gamma- and Itakura-Saito divergences.
(A toy sketch of the AB-divergence and its multiplicative NMF update appears below, after the metadata fields.)
dc.format: application/pdf
dc.language.iso: eng
dc.publisher: Multidisciplinary Digital Publishing Institute
dc.relation.ispartof: Entropy, 13(1), 134-170
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: nonnegative matrix factorization (NMF)
dc.subject: robust multiplicative NMF algorithms
dc.subject: similarity measures
dc.subject: generalized divergences
dc.subject: Alpha-divergences
dc.subject: Beta-divergences
dc.subject: Gamma-divergences
dc.subject: extended Itakura-Saito like divergences
dc.subject: generalized Kullback-Leibler divergence
dc.title: Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization
dc.type: info:eu-repo/semantics/article
dcterms.identifier: https://ror.org/03yxnpp24
dc.type.version: info:eu-repo/semantics/publishedVersion
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.contributor.affiliation: Universidad de Sevilla. Departamento de Teoría de la Señal y Comunicaciones
dc.relation.publisherversion: http://dx.doi.org/10.3390/e13010134
dc.identifier.doi: http://dx.doi.org/10.3390/e13010134
dc.identifier.idus: https://idus.us.es/xmlui/handle/11441/31765
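
The abstract above describes the method only in prose, so here is a minimal, non-authoritative NumPy sketch. It implements the AB-divergence D_AB(P||Q) = -1/(alpha*beta) * sum(p^alpha q^beta - alpha/(alpha+beta) p^(alpha+beta) - beta/(alpha+beta) q^(alpha+beta)) for alpha, beta, alpha+beta != 0, plus one multiplicative update of the general form the paper derives. The helper names (ab_divergence, ab_update), the eps clipping, and the toy usage are illustrative assumptions; the singular cases alpha = 0, beta = 0, or alpha + beta = 0 (handled in the paper by continuity via deformed logarithms) are omitted.

```python
import numpy as np

def ab_divergence(P, Q, alpha, beta, eps=1e-12):
    """AB-divergence D_AB^(alpha,beta)(P||Q), valid for alpha, beta,
    alpha+beta != 0; singular cases (defined by continuity) are omitted."""
    P, Q = np.maximum(P, eps), np.maximum(Q, eps)  # keep fractional powers finite
    s = alpha + beta
    return -np.sum(P**alpha * Q**beta
                   - (alpha / s) * P**s
                   - (beta / s) * Q**s) / (alpha * beta)

def ab_update(Y, A, X, alpha, beta, eps=1e-12):
    """One multiplicative update of X for the model Y ~ A @ X (alpha != 0).
    Sanity checks: alpha = beta = 1 reduces to the Lee-Seung Euclidean
    update; alpha = 1, beta = 0 reduces to the EMML/KL update."""
    Y = np.maximum(Y, eps)
    Yhat = np.maximum(A @ X, eps)                 # current reconstruction
    num = A.T @ (Y**alpha * Yhat**(beta - 1.0))   # attracting term
    den = np.maximum(A.T @ Yhat**(alpha + beta - 1.0), eps)
    return X * (num / den) ** (1.0 / alpha)

# Toy usage: factorize a random nonnegative matrix (hypothetical sizes),
# updating A through the transposed model Y^T ~ X^T @ A^T.
rng = np.random.default_rng(0)
Y = rng.random((20, 4)) @ rng.random((4, 30))
A, X = rng.random((20, 4)), rng.random((4, 30))
for _ in range(200):
    X = ab_update(Y, A, X, alpha=1.0, beta=0.5)
    A = ab_update(Y.T, X.T, A.T, alpha=1.0, beta=0.5).T
print(ab_divergence(Y, A @ X, alpha=1.0, beta=0.5))  # should be small
```

The same rule serves for A because Y ~ A @ X implies Y.T ~ X.T @ A.T; tuning alpha and beta reweights how strongly large residuals drive the update, which is the robustness mechanism the abstract refers to.
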

Files                   Size      Format   View         Description
entropy-13-00134.pdf    846.7Kb   PDF      View/Open

This record appears in the following collections


Attribution-NonCommercial-NoDerivatives 4.0 International
Except where otherwise noted, the item's license is described as: Attribution-NonCommercial-NoDerivatives 4.0 International