Article Open access Peer-reviewed

Temporal Fusion Transformers for interpretable multi-horizon time series forecasting

2021; Elsevier BV; Volume: 37; Issue: 4; Language: English

10.1016/j.ijforecast.2021.03.012

ISSN

1872-8200

Authors

Bryan Lim, Sercan Ö. Arık, Nicolas Loeff, Tomas Pfister

Topic(s)

Energy Load and Power Forecasting

Abstract

Multi-horizon forecasting often contains a complex mix of inputs – including static (i.e. time-invariant) covariates, known future inputs, and other exogenous time series that are only observed in the past – without any prior information on how they interact with the target. Several deep learning methods have been proposed, but they are typically 'black-box' models that do not shed light on how they use the full range of inputs present in practical scenarios. In this paper, we introduce the Temporal Fusion Transformer (TFT) – a novel attention-based architecture that combines high-performance multi-horizon forecasting with interpretable insights into temporal dynamics. To learn temporal relationships at different scales, TFT uses recurrent layers for local processing and interpretable self-attention layers for long-term dependencies. TFT utilizes specialized components to select relevant features and a series of gating layers to suppress unnecessary components, enabling high performance in a wide range of scenarios. On a variety of real-world datasets, we demonstrate significant performance improvements over existing benchmarks, and highlight three practical interpretability use cases of TFT.
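To make the abstract's mention of "gating layers to suppress unnecessary components" concrete, below is a minimal PyTorch sketch of a gated residual block in the spirit of TFT's gating mechanism. This is an illustrative assumption, not the authors' reference implementation: the class name, layer sizes, and dropout rate are placeholders. The idea it shows is that a GLU-style gate can drive the nonlinear branch toward zero, so the residual path passes the input through largely unchanged and the block is effectively skipped.

```python
import torch
import torch.nn as nn


class GatedResidualBlock(nn.Module):
    """Sketch of a gated residual unit in the spirit of TFT's gating layers.

    A nonlinear transformation is combined with the input via a gated
    skip connection: when the sigmoid gate saturates near zero, the
    block reduces to (approximately) the identity, suppressing the
    nonlinear component. Names and sizes here are assumptions.
    """

    def __init__(self, d_model: int, dropout: float = 0.1):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_model)
        self.elu = nn.ELU()
        self.fc2 = nn.Linear(d_model, d_model)
        self.dropout = nn.Dropout(dropout)
        # GLU-style gate: sigmoid(W x + b) scales a linear "value" branch.
        self.gate = nn.Linear(d_model, 2 * d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, a: torch.Tensor) -> torch.Tensor:
        h = self.elu(self.fc1(a))
        h = self.dropout(self.fc2(h))
        value, gate = self.gate(h).chunk(2, dim=-1)
        # Residual connection plus gated branch, then layer normalization.
        return self.norm(a + value * torch.sigmoid(gate))


# Usage: a batch of 32 series, 48 time steps, 16-dimensional features.
x = torch.randn(32, 48, 16)
block = GatedResidualBlock(d_model=16)
print(block(x).shape)  # torch.Size([32, 48, 16])
```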
