Algorithmic design in EdTech: Investigating adaptivity for learners and teachers in a digital personalised learning tool in Kenya

Image showing EdTech Hub experts engaging with a digital personalised learning tool in Kenya

Are you a policymaker, EdTech developer, educator, or parent? Are you interested in how technology and personalised learning can equitably help enhance student engagement and learning outcomes? Do you think about how the data collected by digital tools can best impact learners’ progress? These are some of the questions we have been asking at EdTech Hub. Read on to find out about our work and what we are discovering. 

A holistic, multi-strand approach to researching digital personalised learning 

EdTech Hub has been collaborating for the past two years with the educational platform developer EIDU on a Hub-Led Research project to investigate the impact of digital personalised learning (DPL) on literacy and numeracy outcomes for pre-primary learners in Kenya. The project is part of a larger field examining the potential impacts of DPL on learning outcomes in low- and middle-income countries (LMICs).

So far, the study has examined and continues to report on the pedagogical integration of EIDU’s classroom-based DPL tool and how to increase digital device usage equity, using design-based research, randomised controlled trials, and sandbox co-design methods. The multi-strand nature of this study enables it to explore the various components of classroom-integrated DPL holistically, deepening evidence around how this form of EdTech can be sustainably embedded in education systems.

The most recent strand of our study, funded by the Gates Foundation, goes beyond classroom implementation to focus on the software component and the adaptive nature of DPL: namely, algorithm training and backend design in enhancing learning outcomes. We explore the ‘black box’ of data generated by the EIDU DPL tool, that is, the continuously changing decisions that the tool makes to personalise learning based on ever-evolving factors, particularly data from the digital assessments that learners regularly engage with.

Our main questions are: How can we enhance the personalisation of learning content using data from digital assessments? How can data on student performance within an adaptive DPL tool inform teachers’ lesson planning and instruction?

How are adaptive DPL tools designed to improve learning outcomes? 

Adaptive DPL tools are technologies that personalise learning for diverse learners. They use algorithm training techniques to analyse real-time assessment data and adjust learning pathways and instructional pace on the fly, based on individual behaviours, prior knowledge, time spent on a task, and prior assessment scores.

These tools can categorise learners based on their comparative pace of learning for a particular curriculum competency (i.e., mastery levels) and detect levels of engagement. Research in well-resourced educational contexts suggests that DPL tools can improve learning gains in maths at pre-primary and primary levels. While there is less overall research on LMICs, a meta-analysis of the available evidence found that DPL tools with adaptive personalised strategies had a statistically significant effect on learning gains. 
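To make mastery-level categorisation concrete, here is a minimal, purely illustrative sketch (our own example, not EIDU’s actual code; the band names and thresholds are assumptions) of how a tool might band learners by their recent assessment scores for one curriculum competency:

```python
# Illustrative sketch: band a learner by average of recent assessment
# scores (each in the range 0-1) for one curriculum competency.

def mastery_level(recent_scores, thresholds=(0.4, 0.8)):
    """Return a coarse mastery band from a learner's recent scores."""
    if not recent_scores:
        return "no data"
    avg = sum(recent_scores) / len(recent_scores)
    if avg < thresholds[0]:
        return "emerging"
    if avg < thresholds[1]:
        return "approaching mastery"
    return "mastered"

print(mastery_level([0.9, 0.85, 0.95]))  # mastered
```

Real tools weigh many more signals (recency, item difficulty, time on task), but the underlying idea is the same: map a learner’s evidence stream to a small set of actionable levels.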

To generate evidence about how backend design and algorithm training using data can enhance learning outcomes, we need to understand and categorise different DPL tools. To this end, our team adopted an ‘adaptivity framework’ developed by Van Schoors et al. (2021), which consists of four questions covering what the tools adapt to, what they adapt, when they adapt, and how they adapt. Based on different taxonomies in the literature, our team added a fifth question borrowed from Bernacki et al. (2021), that is: For what educational purpose do the tools adapt? In answering these questions, it is possible to identify how different DPL tools are designed to consider factors such as a student’s prior knowledge, scores, motivation levels, the best ways to sequence the content presented to the learner, and how to integrate feedback to and from both learners and teachers (see also Holstein et al., 2020; Kabudi et al., 2021; Plass & Pawar, 2020; Tetzlaff et al., 2021).

Exploring the kind of real-time data that is continuously generated by adaptive DPL tools — i.e., learning analytics — and how this data feeds into the tool design and personalisation is also important. Using this data, adaptive DPL tools make predictions of a student’s success in certain learning domains and determine the kind of scaffolding activities needed for a student to master a particular competency.
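One common way such predictions are made is Bayesian Knowledge Tracing, which maintains a running probability that a learner has mastered a competency and updates it after each right or wrong answer. The sketch below is a simplified illustration of that general technique; the parameter values and names are our own assumptions, not EIDU’s implementation:

```python
# Simplified Bayesian Knowledge Tracing (BKT): estimate P(mastery) of a
# competency from a stream of correct/incorrect answers.
# Illustrative parameters: chance of learning per attempt (transit),
# slipping on a known skill, and guessing on an unknown one.
P_TRANSIT, P_SLIP, P_GUESS = 0.15, 0.10, 0.20

def bkt_update(p_know, correct):
    """Update the belief that the learner knows the skill after one answer."""
    if correct:
        cond = p_know * (1 - P_SLIP) / (
            p_know * (1 - P_SLIP) + (1 - p_know) * P_GUESS)
    else:
        cond = p_know * P_SLIP / (
            p_know * P_SLIP + (1 - p_know) * (1 - P_GUESS))
    # Account for the chance the learner acquired the skill this attempt.
    return cond + (1 - cond) * P_TRANSIT

p = 0.3  # prior belief that the learner knows the skill
for answer in [True, True, False, True]:
    p = bkt_update(p, answer)
```

When the estimated mastery probability stays low, a tool can respond by scheduling scaffolding activities; when it crosses a threshold, the learner can move on.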

Engaging with the adaptivity framework and how adaptive DPL models are technically built can help us better understand DPL tools. The framework might also enable policymakers and educators to select tools that align with their governments’ priorities and guidance on pedagogical practice, in order to enhance learning outcomes and student engagement and to facilitate collaborative learning.

To improve DPL in LMICs, more collaborative research is needed to increase transparency about algorithmic design, improve methods for bias removal, and include learners, instructors, and policymakers in the interpretation of and usage of learning analytics. Furthermore, collaboration between education stakeholders, researchers, and developers can enhance the design of algorithms that are pedagogically informed.

An example of adaptive DPL in pre-primary Kenyan classrooms

This research focuses on EIDU: an example of an adaptive DPL tool in Kenya, comprising an application with both a teacher-facing and a learner-facing interface for early grade teaching and learning. This application is pre-installed on a low-cost Android device, with one to two devices distributed per classroom. As of January 2024, some 225,000 active learners from 4,000 pre-primary schools are enrolled in the tool. Teachers can create unique profiles for their learners, who use the tool daily for an average of five minutes each. The app provides digital learning units and evaluation exercises focusing on literacy and numeracy, organised into strands and sub-strands, and aligned with the Kenyan competency-based curriculum. The main purpose (i.e., why) of EIDU’s personalisation is to enhance learning outcomes by customising content sequencing, as opposed to customising the content itself.

So, how are we investigating algorithmic design and data adaptivity for teachers and learners at EdTech Hub? 

Our main research question for this new strand of work is: What effect do various algorithmic personalisation strategies have on pre-primary learners’ literacy and numeracy learning outcomes? The study involves co-designing with EIDU and testing software interventions — that is, different pedagogically informed algorithmic formulas at the backend of the EIDU tool. This is followed by analysing the emerging data and assessing the effects of such interventions on learning and engagement.

While the interventions are entirely software-based, they build on two years of prior collaborative field research with teachers using EIDU, as well as literature reviews within the fields of digital pedagogy, artificial intelligence in education, and adaptive designs.

Specifically, we focus on two main areas of research: 

  • Optimising DPL for learners: How to improve the development of learner personalisation (i.e., adaptivity) using data from digital assessments.
  • Optimising DPL for teachers: How to meaningfully communicate data from the DPL tool to help inform teachers’ decisions around lesson planning, instruction, and equitable distribution of digital devices among learners in the classroom.

The first of these main areas — optimising DPL for learners — comprises co-designing four software interventions with EIDU to improve the algorithmic design of the tool. Referring to the adaptivity framework mentioned earlier, these interventions specifically focus on how the tool adapts and what it adapts to. These areas of adaptivity were identified as the leverage points for the highest possible impact on learning outcomes in EIDU’s case. These software interventions include:

  • Modifying the duration of learner practice sessions: Initially, this was set at 5 minutes per learner. Interventions included experimenting with durations of 7 and 9 minutes to determine optimal usage in a classroom setting.
  • Providing learners with choices to select their next activity: Initially, a learner was automatically moved to the next activity chosen by the tool. Interventions included offering a choice among three suggested activities to examine the impact of learner agency on student performance.
  • Expanding the variety of factors for personalising learning content: The algorithm selects the next activity to maximise either scores or engagement, based on the learner’s history, rather than following a fixed pedagogical sequence.
  • Offering the option of repeating previously completed learning exercises: To assess whether repetition helps learners master curriculum competencies.
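The third intervention above can be thought of as an explore-exploit selection policy: usually pick the activity with the best expected outcome for this learner, but occasionally try something else so newer activities still get surfaced. The following is our own hedged illustration under assumed data shapes, not EIDU’s backend:

```python
# Illustrative explore-exploit (epsilon-greedy) activity selection.
# history maps activity id -> list of past outcomes in [0, 1] for the
# chosen objective (e.g., scores or an engagement measure).
import random

def choose_next(history, activities, epsilon=0.1):
    """Pick the next activity for a learner from their history."""
    if random.random() < epsilon:
        return random.choice(activities)  # explore: occasional random pick
    def expected(activity):
        outcomes = history.get(activity, [])
        # Unseen activities get a neutral default so they can still win.
        return sum(outcomes) / len(outcomes) if outcomes else 0.5
    return max(activities, key=expected)  # exploit: best expected outcome

history = {"count-to-10": [0.9, 0.8], "letter-sounds": [0.4]}
best = choose_next(history, ["count-to-10", "letter-sounds"], epsilon=0.0)
```

With `epsilon=0.0` the policy always exploits, so `best` here is `"count-to-10"`; a small positive epsilon trades a little short-term performance for better long-term estimates.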

Through the above interventions and subsequent data analysis, we aim to provide practical suggestions for algorithmic design, enabling the algorithm to suggest to each learner the activity they are most likely to answer correctly, thereby enhancing the learner’s overall learning experience.

In terms of optimising DPL for teachers, we co-designed and implemented five software interventions to support different teaching processes, including lesson planning, classroom management, and instruction informed by assessment. The interventions included:

  • Teacher timer for lesson planning: Providing a timer to help teachers track their lesson teaching time and align it with government guidelines. 
  • Teacher choice of learners’ practice session: Offering suggestions for teachers to select suitable learning practices for individual learners based on data from learners’ performance histories and Kenya’s structured pedagogy programme.
  • Teacher participation in enhancing digital use equity: Surfacing information about learners’ device usage time, enabling teachers to choose which learners to hand the device to next for practice.
  • Teacher dashboards for instruction informed by assessment data: Providing teachers with data on learners’ performance and progress through digital learning units, while categorising learners into levels of competencies outlined in the curriculum. 
  • Teacher prompting and notification system: Reminding teachers to view the dashboards that inform them of student performance.
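As a rough illustration of the dashboard intervention, the sketch below (our own hypothetical names, data shapes, and thresholds, not EIDU’s code) groups learners into competency bands so a teacher can see at a glance who needs support:

```python
# Illustrative dashboard grouping: band learners by the fraction of
# digital learning units they have mastered (0-1), highest band first.

def group_by_band(progress,
                  bands=((0.8, "exceeding"), (0.5, "meeting"), (0.0, "below"))):
    """progress: {learner_name: fraction of units mastered}."""
    grouped = {label: [] for _, label in bands}
    for learner, score in sorted(progress.items()):
        for cutoff, label in bands:
            if score >= cutoff:  # bands are ordered from highest cutoff down
                grouped[label].append(learner)
                break
    return grouped

sample = {"Amina": 0.9, "Brian": 0.55, "Chebet": 0.2}
print(group_by_band(sample))
```

Grouping like this turns a stream of per-item assessment data into something a teacher can act on during lesson planning, which is the point of the dashboard and notification interventions.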

Through the above-mentioned interventions, this study represents an innovative collaboration between EdTech Hub education researchers and EIDU’s developers and data scientists. It sheds light on how algorithm training, backend design, and data interpretation affect DPL implementation and impact — a research need identified by Van Schoors et al. (2021) in their systematic review of 25 years of DPL research. EdTech Hub’s study aims to address current research gaps by enhancing our understanding of DPL, adaptive learning design, and artificial intelligence in education. It focuses on how to develop personalised learning systems, redesign pedagogically informed algorithms, and meaningfully communicate data back to teachers for an optimal learner experience through a DPL tool, with a particular emphasis on the impact on learning outcomes, especially in LMICs and pre-primary education.

We are currently working to analyse the data and will share the results in digestible knowledge packs that cover different areas of our interventions. These packs will be available for different audiences, including policymakers, EdTech developers, researchers, and educators. We will also be presenting at relevant academic conferences to continue the conversation with academic colleagues. Stay tuned as we communicate the results of this research on our webpage, and reach out to us at @GlobalEdTechHub for more insights, questions, or to discuss collaboration. 

Sources referenced 

The following list of references includes details of work referred to directly or for which links are embedded in this blog. 

Alrawashdeh, G. S. (2022). Approaches to technology-enabled personalized and adaptive learning: A brief overview. OSF Preprints. 

Basham, J. D., Hall, T. E., Carter, R. A., & Stahl, W. M. (2016). An operationalized understanding of personalized learning. Journal of Special Education Technology, 31(3), 126–136.

Bearman, M., & Ajjawi, R. (2023). Learning to work with the black box: Pedagogy for a world with artificial intelligence. British Journal of Educational Technology, 54, 1160–1173. 

Bernacki, M.L., Greene, M.J. & Lobczowski, N.G. (2021). A systematic review of research on personalized learning: Personalized by whom, to what, how, and for what purpose(s)? Educational Psychology Review 33, 1675–1715. 

Holstein, K., Aleven, V., & Rummel, N. (2020). A conceptual framework for human–AI hybrid adaptivity in education. In I. Bittencourt, M. Cukurova, K. Muldner, R. Luckin, & E. Millán (Eds.), Artificial Intelligence in Education. AIED 2020. Lecture Notes in Computer Science, vol. 12163. Springer, Cham.

Kabudi, T., Pappas, I., & Olsen, D. (2021). AI-enabled adaptive learning systems: A systematic mapping of the literature. Computers and Education: Artificial Intelligence, 2, 100017.

Major, L., Francis, G. A., & Tsapali, M. (2021). The effectiveness of technology-supported personalised learning in low- and middle-income countries: A meta-analysis. British Journal of Educational Technology, 52, 1935–1964. 

Plass, J. L. & Pawar, S. (2020). Toward a taxonomy of adaptivity for learning, Journal of Research on Technology in Education, 52:3, 275-300, DOI: 10.1080/15391523.2020.1719943

Tetzlaff, L., Schmiedek, F. & Brod, G. (2021). Developing personalized education: A dynamic framework. Educational Psychology Review 33, 863–882. 

Thai, K., Bang, H. J., & Li, L. (2022). Accelerating early math learning with research-based personalized learning games: A cluster randomized controlled trial. Journal of Research on Educational Effectiveness, 15(1), 28–51. DOI: 10.1080/19345747.2021.1969710

UNICEF. (2022, May 26). Trends in Digital Personalized Learning | UNICEF Office of Global Insight & Policy. Retrieved 29 Jan 2024

Van Schoors, R., Elen, J., Raes, A., & Depaepe, F. (2021). An overview of 25 years of research on digital personalised learning in primary and secondary education: A systematic review of conceptual and methodological trends. British Journal of Educational Technology, 52, 1798–1822.

Vandewaetere, M., Clarebout, G. (2014). Advanced Technologies for Personalized Learning, Instruction, and Performance. In: Spector, J., Merrill, M., Elen, J., Bishop, M. (eds) Handbook of Research on Educational Communications and Technology. Springer, New York, NY. 

Wang, S., Christensen, C., Cui W., Tong, R., Yarnall, L., Shear, L. & Feng, M. (2023). When adaptive learning is effective learning: Comparison of an adaptive learning system to teacher-led instruction, Interactive Learning Environments, 31:2, 793–803, DOI: 10.1080/10494820.2020.1808794

Connect with Us

Get a regular round-up of the latest in clear evidence, better decisions, and more learning in EdTech.



The findings, interpretations, and conclusions expressed in the content on this site do not necessarily reflect the views of the UK government, the Bill & Melinda Gates Foundation, the World Bank, the Executive Directors of the World Bank, or the governments they represent.

EDTECH HUB 2024. Creative Commons Attribution 4.0 International License.
