Tomorrow Brings Greater Knowledge: Large Language Models Join Dynamic Temporal Knowledge Graphs

Di Maio, C.; Maggini, M.; Melacci, S.
2024-01-01

Abstract

Large Language Models (LLMs) have demonstrated exceptional capabilities in understanding and generating human-like text. In this paper, we leverage their powerful skills in the scope of lifelong learning agents. Instead of relying on fine-tuning procedures, we exploit Temporal Knowledge Graphs (TKGs) to continually store and update fresh information. In particular, we introduce a novel in-context learning approach, called Continual In-context Knowledge LLM (CIK-LLM), capable of bridging an LLM with a dynamically changing TKG. The graph is updated whenever new knowledge becomes available, while the LLM is instructed to find the relational paths that are most relevant to the input instruction, with the goal of identifying smaller subgraphs of evidence. We propose to encode the subgraphs in a compressed, prompt-friendly manner, efficiently bridging LLMs and TKGs. Then, the LLM provides an answer conditioned on the knowledge in the graph, exploiting its skills to support the reasoning process. We evaluate our approach on a TKG Question Answering benchmark which includes questions about events that happened at different times. The same questions are asked to models equipped with obsolete or incomplete information and to models including progressively more up-to-date knowledge. CIK-LLM outperforms pre-trained LLMs, being able to immediately adapt to newly accumulated knowledge, and it reaches performance close to that of a state-of-the-art model which is trained not only exploiting LLMs but also large datasets of questions and answers. Furthermore, our model represents a valuable "forgetting-free" approach to quickly adapt an LLM to novel domains without any fine-tuning, QA datasets, or incremental learning procedures.
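The pipeline summarized in the abstract (continually updating a TKG, selecting a question-relevant evidence subgraph, and serializing it into a compact prompt context) can be illustrated with a minimal sketch. This is a hypothetical simplification for intuition only, not the authors' implementation: the quadruple representation, the entity-overlap subgraph selection, and the textual encoding format are all assumptions.

```python
# Hypothetical sketch (not the paper's code): a TKG stored as
# (subject, relation, object, timestamp) quadruples is updated as new facts
# arrive, a relevant subgraph is selected for a question, and its facts are
# serialized into a compact textual context prepended to the LLM prompt.

class TemporalKG:
    def __init__(self):
        self.facts = []  # list of (subject, relation, object, timestamp)

    def update(self, subject, relation, obj, timestamp):
        """Append fresh knowledge; no model retraining is needed."""
        self.facts.append((subject, relation, obj, timestamp))

    def subgraph(self, entities):
        """Select facts touching the entities mentioned in the question
        (a stand-in for the relational-path search described in the paper)."""
        return [f for f in self.facts if f[0] in entities or f[2] in entities]


def encode_for_prompt(facts):
    """Serialize the evidence subgraph in a compact, prompt-friendly form."""
    return "\n".join(f"({s} | {r} | {o} | {t})" for s, r, o, t in facts)


kg = TemporalKG()
kg.update("Alice", "works_at", "AcmeCorp", "2023-05")
kg.update("Alice", "works_at", "BetaInc", "2024-02")  # newer knowledge

context = encode_for_prompt(kg.subgraph({"Alice"}))
prompt = f"Knowledge:\n{context}\n\nQuestion: Where did Alice work in 2024?"
print(prompt)
```

Because the graph is the only component that changes over time, the LLM itself never needs fine-tuning, which is what makes the approach "forgetting-free".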
2024
Di Maio, C., Zugarini, A., Giannini, F., Maggini, M., Melacci, S. (2024). Tomorrow Brings Greater Knowledge: Large Language Models Join Dynamic Temporal Knowledge Graphs. In Proceedings of Machine Learning Research (pp. 560-576). ML Research Press.
Files in this record:

File: maio25a.pdf (open access)
Type: Publisher's PDF
License: PUBLIC - Public with Copyright
Size: 3.67 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/1300994