tft: an R implementation of the Temporal Fusion Transformer.

The Temporal Fusion Transformer (TFT) is a neural network architecture proposed by Bryan Lim et al. for producing multi-horizon forecasts for multiple time series with a single model.

The main difference between TFT and conventional forecasting methodologies is that its architecture can encode the different types of input that arise in forecasting problems. For instance, the model handles static covariates and time-varying inputs (both known and unknown in the future) differently. TFT also reported promising benchmark results.
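To make these input types concrete, here is a toy base-R data frame for two store-level sales series; the column names and values are purely illustrative and not part of the tft API:

```r
# Toy daily sales data for two stores, illustrating the three input types
# TFT distinguishes (all names here are hypothetical):
sales <- data.frame(
  store      = rep(c("A", "B"), each = 3),             # static covariate: fixed per series
  date       = rep(as.Date("2024-01-01") + 0:2, 2),    # time index
  is_holiday = rep(c(FALSE, TRUE, FALSE), 2),          # known time-varying: available for future dates
  n_visitors = c(120, 80, 130, 60, 40, 70),            # unknown time-varying: observed only in the past
  sales      = c(10, 5, 12, 4, 2, 6)                   # target to forecast
)
str(sales)
```

A conventional univariate model would typically drop or flatten these distinctions; TFT's architecture has dedicated pathways for each of them.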

The code in this repository is heavily inspired by code from akeskiner/Temporal_Fusion_Transform, jdb78/pytorch-forecasting, and the original implementation.

Installation

You can install the development version from GitHub with:

# install.packages("remotes")
remotes::install_github("mlverse/tft")

Read the Getting Started guide to fit your first model with tft.