
Zeinalipour, K. (2026). From Language to Learning: An LLM-Driven Framework for Multilingual Educational Content.

From Language to Learning: An LLM-Driven Framework for Multilingual Educational Content

Kamyar Zeinalipour
2026-02-10

Abstract

This thesis introduces "From Language to Learning," a unified, LLM-driven framework designed to address the critical challenge of creating scalable, personalized educational content across diverse linguistic and pedagogical contexts. It proposes a systematic methodology for adapting general-purpose large language models (LLMs) into specialized, open-source educational aids, particularly for under-resourced languages.

The cornerstone of this framework is an adaptation of knowledge distillation for creating high-quality, task-specific instructional datasets: the Instruct series. First, a state-of-the-art LLM is prompted to generate structured educational materials from raw, unstructured text. This synthetic data is then used to fine-tune smaller, open-source models, efficiently transferring sophisticated capabilities for specific learning objectives.

The framework's pedagogical versatility is demonstrated through a progression of applications. To foster engagement and vocabulary reinforcement, we developed systems for generating educational crossword puzzles in English, Italian, Arabic, and Turkish. Moving from gamification to formal assessment, the framework was then extended to create models for generating multiple-choice and short-answer questions in Persian and Turkish. Finally, to provide direct pedagogical intervention, the framework was applied to the nuanced task of delivering automated syntax feedback on student essays in English.
The efficacy of this approach is validated through systematic human and automatic evaluations for each language and task. The results consistently show that fine-tuning smaller LLMs on our generated Instruct datasets turns them into highly effective and reliable specialized tools. Ultimately, this thesis contributes a scalable and validated blueprint that bridges the gap between general-purpose language models and the specific needs of multilingual education, democratizing the creation of next-generation, AI-powered learning technologies. Across this thesis, we produced 11 publications, released 11 datasets (including 7 in the Instruct series), and trained and evaluated 29 fine-tuned generator models across 5 languages, enabling reproducible, multilingual educational NLP at scale. All code, datasets, and models developed in this thesis are publicly available to the research community at: https://github.com/KamyarZeinalipour/From-Language-to-Learning
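The two-stage pipeline described in the abstract (a teacher LLM generates structured material from raw text, and the pairs become an instruction-tuning dataset for a smaller model) can be sketched as follows. This is a minimal illustration only: the teacher call is stubbed, and all function and field names here are hypothetical, not taken from the thesis code.

```python
# Minimal sketch of the distillation-style data pipeline: raw passages go in,
# instruction-tuning records (instruction / input / output) come out.
from dataclasses import dataclass, asdict
import json


@dataclass
class InstructRecord:
    """One record in a hypothetical Instruct-style dataset."""
    instruction: str
    input: str
    output: str


def teacher_generate(passage: str) -> str:
    """Stand-in for the state-of-the-art teacher LLM (stubbed here;
    a real pipeline would prompt an API or local model instead)."""
    topic = passage.split(".")[0]
    return f"Q: What is the main topic of the passage?\nA: {topic}."


def build_instruct_dataset(passages: list[str], task_prompt: str) -> list[dict]:
    """Stage 1: pair each raw passage with the teacher's structured output.
    Stage 2 (not shown) would fine-tune a smaller open model on these records."""
    return [
        asdict(InstructRecord(task_prompt, p, teacher_generate(p)))
        for p in passages
    ]


passages = ["Photosynthesis converts light into chemical energy. Plants rely on it."]
dataset = build_instruct_dataset(
    passages, "Generate a short-answer question from the passage."
)
print(json.dumps(dataset[0], ensure_ascii=False, indent=2))
```

The resulting JSON records follow the common instruction/input/output convention for supervised fine-tuning, so they could be fed directly to a standard fine-tuning loop.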

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/1308994