Simple item record

Conference paper

dc.creator: Lara Benítez, Pedro
dc.creator: Gallego Ledesma, Luis
dc.creator: Carranza García, Manuel
dc.creator: Luna Romera, José María
dc.date.accessioned: 2022-02-18T10:05:33Z
dc.date.available: 2022-02-18T10:05:33Z
dc.date.issued: 2021
dc.identifier.citation: Lara Benítez, P., Gallego Ledesma, L., Carranza García, M. and Luna Romera, J.M. (2021). Evaluation of the transformer architecture for univariate time series forecasting. In CAEPIA 2021: 19th Conference of the Spanish Association for Artificial Intelligence (pp. 106-115). Málaga, España: Springer.
dc.identifier.isbn: 978-3-030-85712-7
dc.identifier.issn: 0302-9743
dc.identifier.uri: https://hdl.handle.net/11441/130056
dc.description.abstract: The attention-based Transformer architecture is earning increasing popularity for many machine learning tasks. In this study, we aim to explore the suitability of Transformers for time series forecasting, which is a crucial problem in different domains. We perform an extensive experimental study of the Transformer with different architecture and hyper-parameter configurations over 12 datasets with more than 50,000 time series. The forecasting accuracy and computational efficiency of Transformers are compared with state-of-the-art deep learning networks such as LSTM and CNN. The obtained results demonstrate that Transformers can outperform traditional recurrent or convolutional models due to their capacity to capture long-term dependencies, obtaining the most accurate forecasts in five out of twelve datasets. However, Transformers are generally more difficult to parametrize and show higher variability of results. In terms of efficiency, Transformer models proved to be less competitive in inference time and similar to the LSTM in training time.
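
To make the abstract concrete, below is a minimal, hypothetical sketch (in PyTorch) of the kind of model the paper evaluates: a Transformer encoder that reads a fixed window of a single series and emits a multi-step forecast. This is not the authors' implementation; the class name, layer sizes, input window, and forecast horizon are all illustrative assumptions.

```python
# Illustrative sketch only -- not the paper's code or configuration.
import torch
import torch.nn as nn

class TransformerForecaster(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2, window=24, horizon=6):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)          # scalar observation -> d_model
        self.pos_embed = nn.Parameter(torch.zeros(1, window, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead,
                                           dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, horizon)          # multi-step forecast head

    def forward(self, x):                                # x: (batch, window, 1)
        h = self.input_proj(x) + self.pos_embed          # embed values, add positions
        h = self.encoder(h)                              # self-attention over the window
        return self.head(h[:, -1])                       # forecast: (batch, horizon)

model = TransformerForecaster()
past = torch.randn(8, 24, 1)                             # 8 windows of 24 past values
print(model(past).shape)                                 # torch.Size([8, 6])
```

Trained with, for example, an MSE loss over sliding windows, a model of this shape plays the role that the study compares against LSTM and CNN baselines.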
dc.description.sponsorship: Ministerio de Ciencia, Innovación y Universidades TIN2017-88209-C2
dc.description.sponsorship: Junta de Andalucía US-1263341
dc.description.sponsorship: Junta de Andalucía P18-RT-2778
dc.format: application/pdf
dc.format.extent: 10
dc.language.iso: eng
dc.publisher: Springer
dc.relation.ispartof: CAEPIA 2021: 19th Conference of the Spanish Association for Artificial Intelligence (2021), pp. 106-115.
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Time series
dc.subject: Forecasting
dc.subject: Attention
dc.subject: Transformers
dc.subject: Deep learning
dc.title: Evaluation of the transformer architecture for univariate time series forecasting
dc.type: info:eu-repo/semantics/conferenceObject
dcterms.identifier: https://ror.org/03yxnpp24
dc.type.version: info:eu-repo/semantics/submittedVersion
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.contributor.affiliation: Universidad de Sevilla. Departamento de Lenguajes y Sistemas Informáticos
dc.relation.projectID: TIN2017-88209-C2
dc.relation.projectID: US-1263341
dc.relation.projectID: P18-RT-2778
dc.relation.publisherversion: https://link.springer.com/chapter/10.1007/978-3-030-85713-4_11
dc.identifier.doi: 10.1007/978-3-030-85713-4_11
dc.publication.initialPage: 106
dc.publication.endPage: 115
dc.eventtitle: CAEPIA 2021: 19th Conference of the Spanish Association for Artificial Intelligence
dc.eventinstitution: Málaga, España
dc.relation.publicationplace: Cham, Switzerland
dc.contributor.funder: Ministerio de Ciencia, Innovación y Universidades (MICINN). España
dc.contributor.funder: Junta de Andalucía

Files: CAEPIA_20_212.pdf (299.1 KB, PDF)
