Presentation
Evaluation of the transformer architecture for univariate time series forecasting
Author/s | Lara Benítez, Pedro; Gallego Ledesma, Luis; Carranza García, Manuel; Luna Romera, José María |
Department | Universidad de Sevilla. Departamento de Lenguajes y Sistemas Informáticos |
Publication Date | 2021 |
Deposit Date | 2022-02-18 |
Published in | |
ISBN/ISSN | 978-3-030-85712-7; 0302-9743 |
Abstract | The attention-based Transformer architecture is earning increasing popularity for many machine learning tasks. In this study, we aim to explore the suitability of Transformers for time series forecasting, which is a crucial problem in different domains. We perform an extensive experimental study of the Transformer with different architecture and hyper-parameter configurations over 12 datasets with more than 50,000 time series. The forecasting accuracy and computational efficiency of Transformers are compared with state-of-the-art deep learning networks such as LSTM and CNN. The obtained results demonstrate that Transformers can outperform traditional recurrent or convolutional models due to their capacity to capture long-term dependencies, obtaining the most accurate forecasts in five out of twelve datasets. However, Transformers are generally more difficult to parametrize and show higher variability of results. In terms of efficiency, Transformer models proved to be less competitive in inference time and similar to the LSTM in training time. |
Funding agencies | Ministerio de Ciencia, Innovación y Universidades (MICINN), España; Junta de Andalucía |
Project ID. | TIN2017-88209-C2; US-1263341; P18-RT-2778 |
Citation | Lara Benítez, P., Gallego Ledesma, L., Carranza García, M. and Luna Romera, J.M. (2021). Evaluation of the transformer architecture for univariate time series forecasting. In CAEPIA 2021: 19th Conference of the Spanish Association for Artificial Intelligence (pp. 106-115), Málaga, España: Springer. |
Files | Size | Format | View | Description |
---|---|---|---|---|
CAEPIA_20_212.pdf | 299.1Kb | PDF | View/ | |