Gadget

Africa’s multilingual small language model goes smaller

Africa’s first multilingual Small Language Model (SLM), InkubaLM, has achieved a 75% reduction in size without losing performance, thanks to African AI talent.

The milestone marks a leap forward in accessible, efficient AI for low-resource environments across the continent.

The breakthrough came through the Buzuzu-Mavi Challenge, hosted by Lelapa AI in partnership with Zindi, which called on machine learning experts across the globe to compress InkubaLM while maintaining its multilingual capabilities. Over 490 participants from 61 countries responded, with all top winners hailing from Africa, a strong endorsement of the continent’s AI innovation potential.

Why Smaller Models Matter 

On a continent where internet access averages just 33% and 70% of people use entry-level smartphones, lightweight AI isn’t a luxury; it’s a lifeline. Smaller models like InkubaLM can run on affordable devices, function without constant connectivity, and power real-world solutions in translation, education, agriculture, and customer service.

The challenge drew global participation, with developers showcasing advanced techniques such as model distillation (training a smaller model to mimic a larger one), quantisation (shrinking the model by making the numbers it uses simpler), and low-rank adaptation (only adjusting parts of the model instead of the whole thing), all under strict hardware and compute constraints. The focus was on machine translation for Swahili and Hausa, two widely spoken yet underrepresented African languages in the AI space.
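To make the quantisation idea above concrete, here is a minimal, hypothetical sketch of symmetric int8 quantisation: each weight is stored as a small integer plus one shared scale factor, cutting storage roughly fourfold compared with 32-bit floats. This is an illustration of the general technique, not the method any specific entrant used.

```python
def quantise_int8(weights):
    """Symmetric int8 quantisation: map floats to integers in [-127, 127]
    plus a single float scale, so each weight needs 1 byte instead of 4."""
    scale = max(abs(w) for w in weights) / 127.0  # largest weight maps to +/-127
    quantised = [round(w / scale) for w in weights]
    return quantised, scale

def dequantise(quantised, scale):
    """Recover approximate float weights from the compact representation."""
    return [q * scale for q in quantised]

weights = [0.5, -1.2, 0.03, 0.9]
q, scale = quantise_int8(weights)
restored = dequantise(q, scale)
# restored values differ from the originals by at most half the scale step
```

The trade-off is exactly the one the challenge constraints reward: a small, bounded rounding error in exchange for a large reduction in memory and bandwidth.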


Meet the Winners 

The winners were:

🥇 Yvan Carré (Cameroon): Compressed InkubaLM using adapter heads (add-ons that specialise in specific tasks), quantisation (shrinking the model’s memory needs), and knowledge distillation (training a smaller model to mimic a larger one), making it leaner without sacrificing capability.

🥈 Stefan Strydom (South Africa): Cut down the model to just 40M parameters by trimming vocabulary (removing infrequent words), reducing layers (streamlining the structure), and sharing embeddings (reusing components to save space).

🥉 Team AI_Buzz – Abdourahamane Ide Salifou, Mubarak Muhammad, and Victor Olufemi (Nigeria & Niger): Built a 177M-parameter student model by blending datasets (combining different sources for broader learning) and applying distillation, achieving both size reduction and solid performance.
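Knowledge distillation, used by several of the winners above, trains a small "student" model to match the softened output distribution of a large "teacher". A minimal, hypothetical sketch of the core loss (cross-entropy between temperature-softened teacher and student outputs) looks like this; the function names and temperature value are illustrative, not taken from any winning entry.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw scores to probabilities; higher temperature softens the distribution."""
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student's softened outputs against the teacher's.
    Minimised when the student reproduces the teacher's distribution."""
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))
```

During training, this term is typically blended with the ordinary loss on ground-truth labels, so the student learns both the correct answers and the teacher's "dark knowledge" about which wrong answers are nearly right.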

A step forward for African AI 

“This challenge isn’t simply about technical progress, it reflects our deeper mission at Lelapa AI: to build AI that is inclusive, accessible, and grounded in African realities,” says Lelapa AI CEO Pelonomi Moiloa. 

“The Buzuzu-Mavi Challenge affirms what we’ve always believed: when AI is designed with Africa in mind, it becomes both technically excellent and deeply transformative. And when African talent is trusted with meaningful challenges, the results are not just outstanding, they’re a glimpse into the future we’re building for and from the continent.”

Celina Lee, co-founder and CEO of Zindi, said: “Seeing the impact that our incredible community of AI builders can have on a truly African problem is inspiring and rewarding in its own right, but even better, these solutions showcase what African innovators can do in the language model space. When the state of the art requires ever larger language models, we’re proud to show the world that more can be done with less.”

The most promising submissions will inform future InkubaLM releases.

Why smaller models matter for Africa 

Across the continent, many communities experience low-connectivity environments, limited compute access, and cost-sensitive devices. Smaller AI models aren’t just convenient; they’re critical, says Lelapa AI (in an AI-generated release).

This makes models like InkubaLM especially powerful in places where infrastructure challenges make traditional AI tools inaccessible. Smaller models bring voice assistants, real-time translation, and inclusive AI tools closer to everyday African users.

From helping farmers access climate information in their home language, to powering multilingual customer support, to enabling schoolchildren to interact with educational content on entry-level smartphones, InkubaLM opens the door to practical AI use across sectors.

InkubaLM is a foundational model, an essential step in building robust African AI solutions. While it hasn’t yet been fine-tuned for widespread deployment, it provides the core building blocks for innovation. Rather than competing with fully developed applications, InkubaLM invites researchers and developers to shape and adapt it to Africa’s linguistic, cultural, and technical realities.
