EdTech Hub in Conversation with The Brookings Institution: Government Decision-Making Around EdTech Innovations

Over the last four years, EdTech Hub has worked closely with education leaders and researchers to empower people in low- and middle-income countries with evidence for effective technology use in education. In this post, we join the conversation about the demand for evidence in EdTech referencing a recent blog written by Rohan Carter-Rau and Brad Olsen at The Brookings Institution.

How to improve government decisionmaking around edtech innovations

—Rohan Carter-Rau and Brad Olsen, November 20, 2023

As part of the Research on Scaling the Impact of Innovations in Education (ROSIE) project, we’ve been investigating how government decisionmakers choose education innovations for their countries—and the combination of forces shaping their decisionmaking. Our recent report examined these decision-making processes in low- and middle-income countries (LMICs), particularly around educational technology (edtech).

For the research informing this report, we conducted hour-long interviews with 10 central-government decisionmakers in LMICs and 10 edtech academics, tech industry experts, and representatives from the global education funding community. We also reviewed existing literature and drew on knowledge from our three years of ROSIE research.

We found that when it comes to edtech, most LMIC decisionmakers are already primed to adopt it. They don’t particularly care about evidence on edtech’s feasibility, sustainability, or impact because their motivation for edtech primarily derives from other sources. Additionally, their views regarding which edtech to prioritize appear to be misaligned with what research and many edtech experts view to be the value of edtech innovations in LMICs.

We believe this can change.

What influences policymakers’ edtech decisions?

When it comes to making decisions around edtech innovations, we identified four factors that exert significant pressure for decisionmakers to adopt edtech:

  • In-country pressure for going digital. Country presidents, parents, and other departments within the government often call for edtech because it’s perceived as necessary for a nation to be considered a modern country.
  • Donor priorities. Some donor organizations’ prioritization of edtech creates a financial incentive for policymakers to adopt it (especially given that LMICs have limited education budgets). It also signals to decisionmakers that edtech must be a valuable investment for education: If the funders like it, it must be good.
  • Signaling. The prevalence of edtech in wealthy education systems signals to LMIC decisionmakers that “successful” education systems are filled with edtech, and few national leaders want to be perceived as uncommitted to building a high-quality, 21st century education system.1
  • Tech company marketing. Government decisionmakers are bombarded by marketing from edtech companies hoping to increase market share. Many of these pitches are sophisticated and include information that looks like solid evidence for the innovation, and many decisionmakers lack the training and time to properly evaluate the companies’ claims.
What kind of edtech do decisionmakers consider?

Our interviews revealed that many decisionmakers seem to focus primarily on three categories of (the rather diverse array of) edtech innovations: “shiny edtech,” education management information systems (EMIS), and artificial intelligence. “Shiny edtech” is our term for innovations that look exciting and cutting-edge, regardless of their demonstrated impact. This includes practices like distributing digital devices to students, adaptive learning software for the classroom, and digital content displays. EMIS innovations, while less “exciting,” have a proven ability to help administrators run school systems more effectively and accurately and can offload some of teachers’ non-teaching tasks—allowing them more instructional time with students. And artificial intelligence-based innovations, as the new kid on the edtech block, currently dominate many government education decisionmaking conversations. When “shiny edtech” receives most of the attention, the myriad other applications of edtech—including proven EMIS—get deprioritized.

The edtech experts from our study expressed caution against “shiny edtech,” noting that these innovations tend to be difficult to scale, expensive to sustain, and have little (or no) firmly demonstrated positive effects on student learning outcomes. They, however, make for good retail politics and are what many tech companies are pitching. EMIS received broad support both from the experts we interviewed and our reviews of the research, while AI-based innovations are too new and disruptive for us to evaluate right now.

How useful is the current research on edtech?

We found that evidence plays a minor role in decisionmakers’ thinking around edtech innovations. Instead, the key factors are (1) who is recommending the edtech innovation, (2) how its deployment will look politically for the government, and (3) the reputation of the innovation or company.

In the rare cases where decisionmakers do want evidence, it’s hard to find, rarely relevant to their situation, and difficult to use. Available research is often outdated (it takes a few years to complete and by then the field has moved on), hard to understand (academic pieces written in technical prose rife with statistical tables), difficult to apply to one’s own specific context, and—in the case of research commissioned by tech companies themselves—not independently reviewed. Some organizations are filling this gap by producing accessible and high-quality research;2 however, the exponential growth of edtech innovations means that finding specific evidence for a single innovation under consideration can be like locating a needle in a haystack.

A framework for making good decisions

Analyzing our data, we saw that most government decisionmakers—intentionally or not—tended to evaluate edtech innovations along three different continua: their motivation to adopt the innovation, the feasibility of implementing the innovation, and the innovation’s potential sustainability for more than a few years. We believe that making this tri-level cognitive calculus visible supports strategic reflection for decisionmakers to more systematically, and with the right evidence, consider and evaluate any proposed innovation for implementation at scale in their jurisdiction. Imagine a three-dimensional space:

Figure 1. Framework for education decisionmaking

Graphic by Nica Basuel

Each of the three continua is an axis that goes from low to high; the higher an innovation sits on an axis, the more favorable it is. For example, we saw some decisionmakers frame education funding and infrastructure reforms as highly motivated in their jurisdiction and very sustainable, but not feasible. Edtech learning innovations (like digital literacy apps) were sometimes framed as motivated, feasible attempts at long-term impact, but with resignation that such optimism was overstated because these kinds of reforms cycle through and are therefore not sustainable.

Use of this heuristic (which isn’t limited to edtech innovations—it can be used for any education innovation under consideration) enables decisionmakers and technical advisors to have candid conversations about the potential of individual innovations in their location. It clarifies a sometimes murky decisionmaking process. It also illuminates where additional evidence is needed. The sequence here is to (1) evaluate where a given innovation sits on all three axes, (2) discuss its potential on each continuum, and, if it still seems appropriate to consider the innovation, (3) figure out what additional information would be needed to push the innovation higher up each continuum. If the innovation under consideration does not ultimately sit high on all three continua, it’s probably not the right choice.
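As an illustrative sketch only (not tooling published with the framework), the three-step sequence above could be modeled as a simple scoring structure. The 0-to-1 scale, the threshold value, and all names below are assumptions introduced for illustration:

```python
from dataclasses import dataclass

@dataclass
class InnovationAssessment:
    """Ratings on the three continua, each scored low (0.0) to high (1.0).
    The numeric scale and threshold are illustrative assumptions, not
    values from the framework itself."""
    name: str
    motivation: float      # motivation to adopt the innovation
    feasibility: float     # feasibility of implementing it at scale
    sustainability: float  # likelihood it lasts more than a few years

    def weakest_axis(self) -> str:
        """Step 3: identify the continuum where additional information
        would be needed to push the innovation higher."""
        scores = {
            "motivation": self.motivation,
            "feasibility": self.feasibility,
            "sustainability": self.sustainability,
        }
        return min(scores, key=scores.get)

    def sits_high_on_all(self, threshold: float = 0.7) -> bool:
        """An innovation is probably the right choice only if it sits
        high on all three continua."""
        return min(self.motivation, self.feasibility,
                   self.sustainability) >= threshold

# Example: a digital literacy app framed as motivated and feasible
# but not sustainable, as in the interviews described above.
app = InnovationAssessment("digital literacy app", motivation=0.8,
                           feasibility=0.7, sustainability=0.3)
print(app.sits_high_on_all())  # False
print(app.weakest_axis())      # sustainability
```

The point of such a sketch is the conversation it structures, not the numbers: making each axis explicit shows where candid discussion and additional evidence are most needed.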

We hope use of this framework can support decisionmakers in being clearer about their reasoning and identifying what data they need to make smart decisions around education innovations. We also encourage funders, edtech providers, and scaling implementers to use this heuristic to think about how to move their own preferred edtech innovations further along each continuum. In this way, it’s our hope that data-informed decisionmaking becomes feasible so that promising innovations are sustainably implemented.

1. And yet, pre-COVID-19 in the U.S., 67% of purchased edtech products and software were not being used, and an average of 97% were not being used intensively.
2. For example, Central Square Foundation, Education Alliance Finland, and EdTech Hub.

*This blog post originally appeared on The Brookings Institution website.*

Thoughts from EdTech Hub – Our view from footnote 2

—Jessica Lowden and Rudolph Ampofo, December 19, 2023

EdTech Hub was referenced in the thought-provoking blog shared above, and after reading it we were inspired to offer a perspective from our unique lens of practice-based research.

Over the course of the last four years, EdTech Hub (the Hub) has worked closely with a range of education decision-makers, stakeholders, implementers, and researchers in service of our mission to empower people by giving them the evidence they need to make decisions about how to effectively deploy technology in low- and middle-income countries (LMICs).

In these contexts, the Hub has encountered a substantial appetite from policymakers and practitioners for research relating to education technology (EdTech, sometimes referred to as ICT). This viewpoint is based on the Hub’s work in its seven focus countries — Bangladesh, Ghana, Kenya, Malawi, Pakistan, Sierra Leone, and Tanzania. But it is also grounded in the experience of the Hub’s global Helpdesk offering, which has responded to 184 requests in 46 additional countries.

Recently, the Hub’s work was included in a thought-provoking piece from the Brookings Center for Universal Education (CUE). We appreciate the acknowledgement in footnote two of the blog post How to improve government decisionmaking around edtech innovations that the Hub is one of the few organizations outside of tech companies conducting practice-based research. We value these opportunities to create conversations on the topics that drive not only our work in EdTech, but also the work of so many others across the field. We’d like to use this opportunity to provide an additional perspective to the work presented by CUE and invite dialogue from other researchers and practitioners.

In our work, we’ve found that decision-makers are eager for both broader and deeper evidence to inform their decisions around technology and its uses, not just in the classroom but in a range of areas across the education sector. However, just as LMICs span a wide range of political and economic backgrounds, each country is at a unique point on its timeline of developing, adopting, and implementing EdTech strategies, each of which was uniquely impacted by the Covid-19 pandemic. This shapes each country’s decision-makers’ perspectives, abilities, desires, and incentives to incorporate data-driven, practice-based research.

Even though ministries have had to react to digitization pressure due to Covid-19 and, in some cases, natural disasters, the Hub continues to see decision-makers’ willingness to work towards incorporating evidence into their implementation decisions, which sometimes involves rethinking original decisions on EdTech interventions. Beyond embedding evidence at the heart of decision-making, we see ministry partners pushing to drive more awareness of the use of evidence and to build an evidence-informed culture around EdTech decisions. In Sierra Leone, for example, EdTech Hub is working with donor partners and the Ministry of Basic and Senior Secondary Education (MBSSE) to leverage evidence to improve teacher placements and school quality assurance. In Ghana, through the National Education Reform Secretariat (NERS), we see the Ministry adopting evidence in education data dashboards to develop a unified data visualization system that will increase accountability on education delivery and inform decision-making within the ministry. The Hub is seeing countries like Kenya, Tanzania, and Malawi undertake similar efforts to use evidence to inform EdTech deployment and improve learning outcomes.

We agree that there are many factors pressuring decision-makers when it comes to EdTech; however, we have also found that there is a wide range of decision-makers, influencers, and users, and this entire ecosystem of actors advocates for the critical use of data. Civil servants within ministries are most often the ones tasked with informing the development of policies and then implementing the broad recommendations reached by consensus in the final policy document. These civil servants, regional and local officials, as well as school principals are in positions to advocate for budget concerns, influence program design, and collaborate with implementing partners. In Sierra Leone, Malawi, and Ghana, we see dedicated efforts to incorporate evidence that shapes the use of data to make informed decisions on teacher allocation, education program implementation, and scale decisions.

Many of the requests for data, information, and support around EdTech coming into the Helpdesk, which provides just-in-time support to discrete requests, are around how to operationalize digital strategies. In numerous contexts, the Hub’s Helpdesk has worked with government officials and decision-makers on how to operationalize digitalization and EdTech plans via in-depth workshops and working sessions. These officials are hungry for evidence-based solutions they can apply in their countries. During these sessions, we’re able to discuss multi-modal EdTech approaches that recognize not only “shiny” or high-tech solutions but also the wide range of middle-, low-, and even no-tech solutions that have proven highly effective in achieving results and are often more aligned with the needs of LMICs.

As researchers, it’s important that we distill our data for this range of audiences. At the Hub, this often means that in addition to technical research papers, we’re also developing accompanying summaries, blog posts, and importantly, tailoring presentations for in-country stakeholders and decision-makers to distill, socialize, and discuss complex research results. We also conduct targeted “rapid evidence reviews” to address the need for real-time data exploring impact.

In sum, the Hub continues to see a strong desire for evidence around EdTech. There are myriad roadblocks to obtaining relevant data at the right point in planning and decision-making cycles, but ultimately, we’ve seen that educators, stakeholders, and decision-makers want EdTech solutions that are proven and relevant to their contexts. And they want evidence to help them make the case for solutions that can have the greatest impact. It’s important that we, as the EdTech research community, help facilitate easier access to information.

What can we do as the EdTech research community to better equip decision-makers?
  • Find a balance between in-depth research and timeliness — while long-term analyses have their place, when it comes to rapidly evolving sectors like EdTech, it’s also necessary to recognize the need for rapid responses;
  • Encourage research on a wide range of EdTech solutions that are relevant to LMIC contexts;
  • Ensure that research is usable — work with decision-makers to provide relevant summaries, visuals, and guides;
  • In addition to data, ensure decision-makers have access to frameworks which help them review data and project analyses and ensure that uptake does not have unintentionally harmful outcomes; (Please see another of our blog posts How EdTech can be used to Help Address the Global Learning Crisis and our five guiding questions for decision-makers)
  • Incorporate a wide range of stakeholders and decision-makers in research decisions — co-create research plans not just with donors and high-level officials, but include a range of participants in the design methodology; and
  • Don’t conduct research in a bubble — help develop a new generation of in-country researchers by partnering with local organizations and experts at all stages of the research.

These suggestions are merely a starting point for the ongoing dialogue needed between researchers, practitioners, and decision-makers. We invite you to join us in these conversations and help improve and equip people to make sustainable, evidence-based decisions that impact learners and communities.

Connect with Us

Get a regular round-up of the latest in clear evidence, better decisions, and more learning in EdTech.

EdTech Hub is supported by

The findings, interpretations, and conclusions expressed in the content on this site do not necessarily reflect the views of the UK government, the Bill & Melinda Gates Foundation, the World Bank, the Executive Directors of the World Bank, or the governments they represent.

EDTECH HUB 2024. Creative Commons Attribution 4.0 International License.
