dc.creator | Lara Benítez, Pedro | es |
dc.creator | Gallego Ledesma, Luis | es |
dc.creator | Carranza García, Manuel | es |
dc.creator | Luna Romera, José María | es |
dc.date.accessioned | 2022-02-18T10:05:33Z | |
dc.date.available | 2022-02-18T10:05:33Z | |
dc.date.issued | 2021 | |
dc.identifier.citation | Lara Benítez, P., Gallego Ledesma, L., Carranza García, M. and Luna Romera, J.M. (2021). Evaluation of the transformer architecture for univariate time series forecasting. In CAEPIA 2021: 19th Conference of the Spanish Association for Artificial Intelligence (pp. 106-115), Málaga, Spain: Springer. | |
dc.identifier.isbn | 978-3-030-85712-7 | es |
dc.identifier.issn | 0302-9743 | es |
dc.identifier.uri | https://hdl.handle.net/11441/130056 | |
dc.description.abstract | The attention-based Transformer architecture is earning increasing popularity for many machine learning tasks. In this study, we aim to explore the suitability of Transformers for time series forecasting, which is a crucial problem in different domains. We perform an extensive experimental study of the Transformer with different architecture and hyper-parameter configurations over 12 datasets with more than 50,000 time series. The forecasting accuracy and computational efficiency of Transformers are compared with state-of-the-art deep learning networks such as LSTM and CNN. The obtained results demonstrate that Transformers can outperform traditional recurrent or convolutional models due to their capacity to capture long-term dependencies, obtaining the most accurate forecasts in five out of twelve datasets. However, Transformers are generally more difficult to parametrize and show higher variability of results. In terms of efficiency, Transformer models proved to be less competitive in inference time and similar to the LSTM in training time. | es |
dc.description.sponsorship | Ministerio de Ciencia, Innovación y Universidades TIN2017-88209-C2 | es |
dc.description.sponsorship | Junta de Andalucía US-1263341 | es |
dc.description.sponsorship | Junta de Andalucía P18-RT-2778 | es |
dc.format | application/pdf | es |
dc.format.extent | 10 | es |
dc.language.iso | eng | es |
dc.publisher | Springer | es |
dc.relation.ispartof | CAEPIA 2021: 19th Conference of the Spanish Association for Artificial Intelligence (2021), pp. 106-115. | |
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International | * |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | * |
dc.subject | Time series | es |
dc.subject | Forecasting | es |
dc.subject | Attention | es |
dc.subject | Transformers | es |
dc.subject | Deep learning | es |
dc.title | Evaluation of the transformer architecture for univariate time series forecasting | es |
dc.type | info:eu-repo/semantics/conferenceObject | es |
dcterms.identifier | https://ror.org/03yxnpp24 | |
dc.type.version | info:eu-repo/semantics/submittedVersion | es |
dc.rights.accessRights | info:eu-repo/semantics/openAccess | es |
dc.contributor.affiliation | Universidad de Sevilla. Departamento de Lenguajes y Sistemas Informáticos | es |
dc.relation.projectID | TIN2017-88209-C2 | es |
dc.relation.projectID | US-1263341 | es |
dc.relation.projectID | P18-RT-2778 | es |
dc.relation.publisherversion | https://link.springer.com/chapter/10.1007/978-3-030-85713-4_11 | es |
dc.identifier.doi | 10.1007/978-3-030-85713-4_11 | es |
dc.publication.initialPage | 106 | es |
dc.publication.endPage | 115 | es |
dc.eventtitle | CAEPIA 2021: 19th Conference of the Spanish Association for Artificial Intelligence | es |
dc.eventinstitution | Málaga, Spain | es |
dc.relation.publicationplace | Cham, Switzerland | es |
dc.contributor.funder | Ministerio de Ciencia, Innovación y Universidades (MICINN). Spain | es |
dc.contributor.funder | Junta de Andalucía | es |