Conference paper
Evaluation of the transformer architecture for univariate time series forecasting
Author(s) | Lara Benítez, Pedro; Gallego Ledesma, Luis; Carranza García, Manuel; Luna Romera, José María |
Department | Universidad de Sevilla. Departamento de Lenguajes y Sistemas Informáticos |
Publication date | 2021 |
Deposit date | 2022-02-18 |
Published in | |
ISBN/ISSN | ISBN 978-3-030-85712-7; ISSN 0302-9743 |
Abstract | The attention-based Transformer architecture is earning increasing popularity for many machine learning tasks. In this study, we aim to explore the suitability of Transformers for time series forecasting, which is a crucial problem in different domains. We perform an extensive experimental study of the Transformer with different architecture and hyper-parameter configurations over 12 datasets with more than 50,000 time series. The forecasting accuracy and computational efficiency of Transformers are compared with state-of-the-art deep learning networks such as LSTM and CNN. The obtained results demonstrate that Transformers can outperform traditional recurrent or convolutional models due to their capacity to capture long-term dependencies, obtaining the most accurate forecasts in five out of twelve datasets. However, Transformers are generally more difficult to parametrize and show higher variability of results. In terms of efficiency, Transformer models proved to be less competitive in inference time and similar to the LSTM in training time. |
Funding agencies | Ministerio de Ciencia, Innovación y Universidades (MICINN), Spain; Junta de Andalucía |
Project identifiers | TIN2017-88209-C2; US-1263341; P18-RT-2778 |
Citation | Lara Benítez, P., Gallego Ledesma, L., Carranza García, M. and Luna Romera, J.M. (2021). Evaluation of the transformer architecture for univariate time series forecasting. In CAEPIA 2021: 19th Conference of the Spanish Association for Artificial Intelligence (pp. 106-115), Málaga, Spain: Springer. |
Files | Size | Format | View | Description |
---|---|---|---|---|
CAEPIA_20_212.pdf | 299.1 kB | PDF | View/ | |