Measuring learning outcomes across programmes focused on learners, teachers and systems

Measuring student learning in a meaningful way is a significant challenge even within a single school; measuring and comparing learning outcomes at a global scale can seem impossible. This is a problem for anyone choosing between options for maximising the impact of education spending, because without comparable data it is difficult to make data-driven decisions. It also shapes priorities for education policy and the implementation of interventions, especially where resources are limited.

In the last five years, the development and growing prominence of the metric “Learning-Adjusted Years of Schooling”, or LAYS (Filmer et al., 2018), has provided a potential solution, and EdTech data is at the forefront of the approach. LAYS tackles the very difficult task of harmonising years of schooling with learning attainment as measured by universal, standardised tests. As a result, the expected learning outcomes from a year of school in one country can be compared with those of another country, or with outcomes measured by a standardised test. The metric rests on a number of assumptions and on the quality of the underlying test data, so while it is not perfect, it does provide a useful tool for comparing different ways of measuring learners.
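The core idea behind LAYS can be illustrated with a short calculation: years of schooling are discounted by how much learning a year actually delivers relative to a high-performing benchmark. The figures below are hypothetical, chosen only to show the mechanics of the adjustment.

```python
# Illustrative sketch of the LAYS idea (Filmer et al., 2018): years of
# schooling scaled by relative learning. All numbers are hypothetical.

def lays(years_of_schooling: float, avg_test_score: float,
         benchmark_score: float) -> float:
    """Learning-adjusted years of schooling: years x relative learning."""
    return years_of_schooling * (avg_test_score / benchmark_score)

# A hypothetical system where pupils complete 10 years of school but
# average 400 on a harmonised test against a benchmark score of 625:
print(round(lays(10, 400, 625), 2))  # 6.4 learning-adjusted years
```

In this sketch, ten years of schooling "count" as only 6.4 learning-adjusted years, which is what makes cross-country comparison possible.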

However, interventions are not always focused directly on what learners are doing – many of the implementations that are the focus of EdTech Hub research activities target teachers and learning systems. The LAYS concept has been extended to measure the learning outcomes of these interventions by correlating outcomes for teachers and systems with learning engagement and attainment (Angrist et al., 2020).

To illustrate this, imagine a teacher professional development (TPD) initiative that results in a 10% increase in lessons prepared with a structured pedagogical approach, which in turn improves student learning attainment by 10%. This could be expected, for example, to lead to an average 1% increase in LAYS per student. Moreover, because of the scale of the TPD intervention, every trained teacher affects all the students in their classroom, so the impact is multiplied.
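The multiplication effect described above can be sketched as a simple aggregation. The class size and number of teachers below are hypothetical placeholders, not figures from any study:

```python
# Hypothetical TPD scenario from the text: a 1% (0.01) LAYS gain per
# student, multiplied across every student each trained teacher reaches.

def total_lays_gain(lays_gain_per_student: float,
                    students_per_teacher: int,
                    teachers_trained: int) -> float:
    """Aggregate LAYS gained across all students reached by trained teachers."""
    return lays_gain_per_student * students_per_teacher * teachers_trained

# e.g. 0.01 LAYS per student, 40 students per class, 500 teachers trained:
print(total_lays_gain(0.01, 40, 500))  # 200.0 LAYS across the cohort
```

The point of the sketch is that a modest per-student effect becomes a large aggregate effect once it is scaled through classrooms, which is why teacher-focused interventions can be attractive on a cost-effectiveness basis.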

Now imagine the scale of systems-focused interventions, where, for example, a new data system might decrease the administrative burden on teachers by 15%, which in turn increases lesson preparation time by 5% and improves teacher retention by 1.5%. To measure the learning outcomes of such an intervention, one must measure the impact of both increased lesson preparation and improved teacher retention on learning outcomes.
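One way to think about this chain is as a small model that multiplies each intermediate effect by an elasticity linking it to learning outcomes. The percentages come from the hypothetical scenario above; the elasticities are pure placeholders, standing in for exactly the evidence gap discussed below:

```python
# Hypothetical chain from the text: a data system cuts admin burden by 15%,
# raising lesson-preparation time by 5% and teacher retention by 1.5%.
# The elasticities linking prep time and retention to learning outcomes
# are assumed placeholder values, not estimates from the literature.

PREP_TIME_GAIN = 0.05        # +5% lesson-preparation time
RETENTION_GAIN = 0.015       # +1.5% teacher retention
PREP_ELASTICITY = 0.2        # assumed: learning gain per unit of prep-time gain
RETENTION_ELASTICITY = 0.5   # assumed: learning gain per unit of retention gain

learning_gain = (PREP_TIME_GAIN * PREP_ELASTICITY
                 + RETENTION_GAIN * RETENTION_ELASTICITY)
print(f"{learning_gain:.4f}")  # 0.0175 -> a 1.75% modelled learning gain
```

The structure of the calculation is straightforward; the difficulty, as the next paragraph explains, is that the elasticity values themselves are not yet well evidenced.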

While there is some research linking increased lesson preparation to improved learning outcomes, more evidence is required to understand and articulate the link between teacher retention (and, by extension, teacher experience) and learning outcomes. At the moment, it is difficult to build a reliable model that can do this effectively. In all these cases, high-quality, rigorous research that is localised in different contexts is valuable in extending an understanding of what can and cannot be measured and compared with LAYS and where the variance or uncertainty is too great to be meaningful.

A significant meta-analysis comparing the LAYS of different types of interventions is Angrist et al.’s 2020 paper ‘How to Improve Education Outcomes Most Efficiently? A Comparison of 150 Interventions Using the New Learning-Adjusted Years of Schooling Metric.’ As part of this analysis, the authors aggregated categories of interventions in a way that allows some general comparisons — with the caveat that some categories, such as “merit-based scholarships” and “structured lesson plans”, have very high variance, as illustrated in the figure below.

[Figure: comparison of LAYS gains across intervention categories. Source: Angrist et al., 2020, p. 15]

This is a valuable analysis, and it also demonstrates how, within the context of measuring LAYS for interventions focused on teachers and systems, a more granular approach would provide additional value. For example, within the model, “General-skills teacher training (in-service)” is shown to have no discernible impact on LAYS, whereas “Teacher accountability and incentive reforms” show a 16% increase. This categorisation is useful, but to extend it to evaluating the effectiveness of specific interventions, further clarity is required on the conditions and specific data needed to link programme outcomes, such as changes in teacher behaviour, to learning outcomes.

This is an important endeavour on which many within the global education community are seeking to make progress, including the World Bank, USAID, and Brookings. Within our work at EdTech Hub, we are contributing by analysing the cost-effectiveness of the implementations within our research portfolio, and the data we are generating can help to expand the picture of which interventions can be compared with LAYS, even when they do not directly engage learners. We are working on a technical paper that outlines a calculation for extrapolating a LAYS estimate for educational interventions that do not directly measure learning outcomes, supported by data from the EdTech Hub research portfolio and a range of related studies. We trust that this will make a helpful contribution to comparing cost-effectiveness estimates across a broader range of intervention classes than has previously been possible.

As mentioned above, there is a lot of related work which informs our thinking and practice on cost-effectiveness analysis in EdTech. The following three resources have been particularly thought-provoking and are worth exploring to find out more: 



The findings, interpretations, and conclusions expressed in the content on this site do not necessarily reflect the views of the UK government, the Bill & Melinda Gates Foundation, or the World Bank, the Executive Directors of the World Bank, or the governments they represent.

© EdTech Hub 2023. Creative Commons Attribution 4.0 International License.
