
Explainable Sparse Attention for Memory-Based Trajectory Predictors

Marchetti F.; Becattini F.; Seidenari L.; Del Bimbo A.
2023-01-01

Abstract

In this paper we address the problem of trajectory prediction, focusing on memory-based models. Such methods are trained to collect a set of useful samples that can be retrieved and used at test time to condition predictions. We propose Explainable Sparse Attention (ESA), a module that can be seamlessly plugged into several existing state-of-the-art memory-based predictors. ESA generates a sparse attention over the memory, selecting a small subset of memory entries that are relevant for the observed trajectory. This enables an explanation of the model's predictions with reference to previously observed training samples. Furthermore, we demonstrate significant improvements on three trajectory prediction datasets.
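To make the mechanism described in the abstract concrete, the following is a minimal sketch (in PyTorch) of sparse attention over a memory bank: a sparsemax projection (Martins & Astudillo, 2016) replaces softmax so that most memory entries receive exactly zero weight, and the surviving non-zero weights identify which stored training samples conditioned the prediction. All names here (sparse_memory_attention, memory_keys, and so on) are illustrative assumptions, not the authors' implementation.

    import math
    import torch

    def sparsemax(scores):
        # Euclidean projection onto the simplex (Martins & Astudillo, 2016).
        # Unlike softmax, low-scoring entries receive exactly zero weight.
        z, _ = torch.sort(scores, dim=-1, descending=True)
        k = torch.arange(1, scores.size(-1) + 1,
                         device=scores.device, dtype=scores.dtype)
        cumsum = z.cumsum(dim=-1)
        # Entries belonging to the support of the sparse distribution.
        support = 1.0 + k * z > cumsum
        k_sup = support.sum(dim=-1, keepdim=True)
        # Threshold tau chosen so the clipped weights sum to one.
        tau = (cumsum.gather(-1, k_sup - 1) - 1.0) / k_sup.to(scores.dtype)
        return torch.clamp(scores - tau, min=0.0)

    def sparse_memory_attention(query, memory_keys, memory_values):
        # query:                       (B, D) encoding of the observed trajectory
        # memory_keys, memory_values:  (M, D) stored training samples
        scores = query @ memory_keys.t() / math.sqrt(query.size(-1))  # (B, M)
        weights = sparsemax(scores)        # sparse: most entries exactly zero
        readout = weights @ memory_values  # (B, D) memory readout
        return readout, weights

    # Usage: the non-zero columns of w point at the training samples
    # that explain each prediction.
    q = torch.randn(2, 64)
    K, V = torch.randn(128, 64), torch.randn(128, 64)
    out, w = sparse_memory_attention(q, K, V)
    print((w > 0).sum(dim=-1))  # memory entries actually attended to, per query

Because sparsemax yields exactly-zero weights rather than merely small ones, the set of attended memory entries is finite and inspectable, which is what enables explanations in terms of previously observed training samples.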
2023
ISBN: 978-3-031-25071-2; 978-3-031-25072-9
Marchetti, F., Becattini, F., Seidenari, L., Del Bimbo, A. (2023). Explainable Sparse Attention for Memory-Based Trajectory Predictors. In Computer Vision – ECCV 2022 Workshops (pp. 543-560). Cham: Springer [10.1007/978-3-031-25072-9_37].
Files in this record:

File: MANTRA_Transformer_ECCVw (3).pdf
Type: Pre-print
License: NOT PUBLIC - Private/restricted access
Format: Adobe PDF
Size: 2.29 MB
Availability: not publicly available; a copy can be requested

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/1230156