Beyond Language: Reframing LLMs in Africa Through Contextual Grounding
Authors: Gilbert Kiplangat Korir, Caroline Heta, Alison Okatch, Ibrahim Fadhili, Irene Jebet Korir, Moses Muiruri Njau, Asbel Rotich Kibet
This position paper reframes how AI development is approached in Africa. Current efforts emphasize linguistic inclusion, training Large Language Models (LLMs) to speak African languages such as Swahili, Yoruba, and Zulu. However, linguistic inclusion alone does not ensure contextual understanding. We propose a multidimensional Framework of African Contextual Dimensions—cultural-linguistic, socioeconomic, historical-political, and epistemic—to guide the design of contextually grounded AI systems.