
How can we unlock learning outcomes in the age of AI?

Scanning the horizon

The rise of Artificial Intelligence (AI) presents both immense opportunities and significant risks for education. Without intentional action, AI could widen the learning divide rather than close it. But what if we could change that trajectory?

Trends and truths about AI

The current wave of AI tools and investments in education presents both exciting opportunities and critical challenges. While some AI-driven solutions are making strides in improving learning outcomes, there remains a gap between these innovations and the actual needs of teachers and learners. Through the initial horizon scan of EdTech Hub’s AI Observatory, we engaged in multiple conversations that highlighted key recurring themes, one of the most pressing being the role of teachers in shaping and utilising AI tools. Too often, teachers are handed AI solutions to integrate into their practice without sufficient training or agency in their development. A significant opportunity lies in flipping this power dynamic, ensuring that AI tools are designed with teachers, not just for them.

A thought-provoking suggestion from the webinar was to map out a ‘day in the life’ of a teacher, identifying every challenge they face and using those insights to guide AI development in a way that genuinely addresses their needs. Rather than designing tools in isolation, the sector must focus on surfacing teacher needs and ensuring that AI solutions serve as meaningful supports rather than additional burdens. In essence, we need to make sure that ‘the hammers are looking for the right nails’ by prioritising practical, teacher-centered approaches in AI development for education.

We’re seeing compelling use cases of AI in marginalised communities, reinforcing a crucial point about where AI tools currently stand. One striking example came from Taleemabad, where even in off-grid, ultra-rural areas, people are climbing to the highest points just to access the internet and make use of AI-powered educational tools. This highlights an important reality: some technological shifts carry an unstoppable momentum, reaching even the most unlikely environments. That reality demonstrates the urgent need for evidence-based and equity-based approaches.

We have also seen similar trends with tools like EIDU and the Warrior Chatbot in Sierra Leone, all of which are actively being used in the communities they were built for. Through AIforEducation.org, the AI Observatory has documented over 300 AI-driven education solutions from around the world, showcasing how AI is being adapted to diverse local needs. These examples reinforce the idea that AI’s role in education is not just about developing cutting-edge technology but about meeting people where they are, ensuring that AI tools are designed for real-world challenges in the communities that need them most.

While much of the conversation around AI in education is rightly focused on opportunity, on promising use cases and innovation, it’s equally important to acknowledge the structural blind spots and risks that aren’t being talked about enough. In assessing what’s inevitable and what can be shaped, we are reminded that just because a trend feels unstoppable doesn’t mean it should go unquestioned. There’s a real danger in assuming inevitability, especially when the foundations of these technologies, like the data they’re trained on and the models that shape them, are often developed far from the contexts where they’re ultimately deployed.

Many of these AI models are being trained on data from the Global North, with little representation from regions like Africa, and even less in local languages or culturally relevant contexts. This creates a disconnect between where AI tools are built and where they are used. Without thoughtful mechanisms for local data sharing, ones that are safe, consent-based, and context-sensitive, we risk reinforcing a one-way flow of influence, where AI tools are shaping learning environments without being informed by them.

There’s also a lot of hype around AI right now, and while excitement can be a driver of innovation, it can also mask the gaps in evidence. We hear a lot about AI reducing teachers’ administrative burdens, but in reality, the opposite could occur. For example, chatbot models intended to support teachers may actually add to their workload, requiring oversight, monitoring, and additional admin, ultimately pulling them away from time spent with students.

Other risks are emerging around the quality and type of educational content being generated. While generative AI holds promise for rapidly creating learning materials, we’ve seen poor-quality outputs from open platforms flood educational spaces. This risks crowding out legitimate, pedagogically sound content. Similarly, the use of predictive AI can help drive efficiency, but when it leads to over-generalisations, it can be harmful to nuanced, learner-centered instruction.

Finally, the growing interest in AI for personalised learning must also be viewed with caution. Tools that track learning gaps can be powerful, but those same technologies can be used to collect sensitive data on opinions, beliefs, even speech. In the wrong hands, this can move from supporting learners to surveilling them, raising deep ethical concerns. 

However, these aren’t reasons to stop exploring AI in education, but they are strong reminders that we need to build critically, intentionally, and inclusively. 

Building for a more equitable future

EdTech Hub’s work is centered around decision-makers, especially within ministries of education. Lea Simpson, Innovation Director at EdTech Hub, leaves decision-makers with one call: move with intentionality. As simple as this sounds, in a time of rapid technological change, being intentional about what you invest in and the choices you make is more difficult than ever. The pace of AI innovation can make it feel like you’re constantly trying to catch up.

“But instead of being stuck in that reactive mode, I’d encourage ministers and policymakers to pause and ask: What future are we actually trying to design towards?” said Lea Simpson. “What is our vision for education in our country five or ten years from now, in the age of AI? And how do we work backward from that vision to make informed, strategic decisions today?”

This kind of forward-thinking approach could help ensure that stakeholders in education are not just chasing trends but building systems that are resilient, inclusive, and aligned with long-term learning outcomes.

As we brought the conversation to a close, we turned the mic to the plenary and asked: What topics in AI are most neglected? What should we avoid as we move forward? The reflections were powerful. Nariman Moustafa challenged us to look beyond the excitement of new tools and instead reckon with the legacy we’re building upon. After more than two centuries of modern schooling, with its undeniable progress but also deep inequities, there is a need to reimagine learning itself. AI offers us more than just upgrades; it offers a chance to rethink how, why, and for whom education is designed.

Another resounding theme was the central role of teachers. In many education systems, especially in the Global South, learners may move in and out, but teachers stay, often for decades. For any AI integration to succeed sustainably, we must prioritise teacher development and support. We must alleviate the burden of ever-evolving tech and ensure educators have the tools and time to adapt meaningfully.

And finally, a word of caution: we heard from practitioners working in countries like Bangladesh about the risks of over-relying on adaptive, personalised tools without proper grounding in the local context. In many places, it’s teachers, not algorithms, who are doing the real personalisation, especially when materials don’t reflect the language or lived experience of the learners.

Reflections from this insightful conversation remind us that AI in education must be more than a technological fix. It must be a human endeavor—deeply intentional, context-aware, and driven by equity.


Connect with Us

Get a regular round-up of the latest in clear evidence, better decisions, and more learning in EdTech.


EdTech Hub is supported by

The findings, interpretations, and conclusions expressed in the content on this site do not necessarily reflect the views of the UK Government, the Bill & Melinda Gates Foundation, the World Bank, the Executive Directors of the World Bank, or the governments they represent.

EDTECH HUB 2025. Creative Commons Attribution 4.0 International License.
