Enhancing the LMIC Evidence Ecosystem: Government Research Capacity
Reviewers: Tom Kaye, Laila Friese
To optimise decision-making, government stakeholders are seeking to use data systems more effectively by both developing and applying rigorous research within a harmonised evidence ecosystem. In this blog, we explore how evidence is currently being produced, communicated, and used in Bangladesh, Kenya, and Sierra Leone. These three countries, with varying degrees of centralisation in education, represent diverse operating environments that shape evidence generation and uptake. We asked government stakeholders about the challenges they face in research and how they strive to overcome them. Based on themes identified through stakeholder interviews, we identify ways governments and development partners can enhance the evidence ecosystems of low- and middle-income countries (LMICs).
As the Covid-19 pandemic forced governments to develop and implement new, rapidly responsive education programmes, governments globally recognised the importance of having the in-house capacity to undertake timely research. Throughout this period, substantial investments have been made to strengthen education data systems to measure, for instance, the impact of distance learning on marginalised learners, the reach of EdTech interventions, and accountability at the school level (Pellini et al., 2021). But even high-quality data cannot fully support decision-making without an effective evidence ecosystem. To quote Varun Banka of Pulse Lab Jakarta (2014),
“Data [and evidence] does not automatically translate into better policymaking processes, but when it is interpreted, analyzed, and critically discussed, it can help make decisions smarter, more transparent and more open.”
The evidence ecosystem
Pellini et al. (2021) define an ‘evidence ecosystem’ as an array of actors, including the government, private sector, and civil society organisations, that provide and/or demand evidence to support the development and implementation of public policies. This includes evidence producers, evidence intermediaries, and evidence users.
According to this framework, the operating environment impacts the very nature of the evidence ecosystem. Even in high-income countries with strong systems of research and development, there is no single globally endorsed approach. For example, in the United Kingdom, the education evidence ecosystem is fairly fragmented but sustained through partnerships. Meanwhile, in Singapore, it is highly centralised with the Ministry of Education partnering directly with semi-autonomous agencies like the National Institute of Education.
Despite the varying types of interactions and interdependencies between stakeholders, some common themes shape evidence ecosystems. What do these different operating environments look like in practice? Below, we explore how evidence is currently being produced, communicated, and used in Bangladesh, Kenya, and Sierra Leone.
Bangladesh
Research and innovation are at the heart of Bangladesh’s Vision 2041, and educational reform is one of its main drivers. According to government stakeholders, the upcoming Blended Education Master Plan supports this vision by providing an opportunity to improve how education evidence is shared, communicated, and used (Government of Bangladesh, 2021; University Grants Commission, 2021). This includes potential plans to develop a research network through Blended Education Research and Development, Innovation and Evidence (BERDIE) units. Currently, there is an abundance of education data in Bangladesh. Most education data is managed by the Bangladesh Bureau of Educational Information and Statistics (BANBEIS), while EMIS data is stored within the Directorate of Secondary and Higher Education (DSHE) and the Directorate of Primary Education (DPE). However, at present, most research is conducted by development partners or commissioned from private sector research firms.
Kenya
In Kenya, during Covid-19-related school closures, the Ministry of Education used data to support a transition into what one stakeholder referred to as “intervention mode”. To streamline education research to support interventions, some government agencies test new interventions while others conduct impact evaluations. At present, the education evidence ecosystem in Kenya is highly fragmented but sustained through partnerships across government agencies and with some academic institutions. The Ministry of Education’s upcoming Policy on Information and Communication Technology in Education and Training has the potential to strengthen evidence production, communication, and usage through partnerships (Ministry of Education, forthcoming).
Sierra Leone
Since launching the Free Quality School Education (FQSE) programme in 2018, Sierra Leone’s Ministry of Basic and Senior Secondary Education (MBSSE) and the Teaching Service Commission (TSC) have invested heavily in using evidence to improve education (McBurnie & Beoku-Betts, 2022). Much progress has been achieved through partnerships to harmonise data from the Annual School Census (Fab Inc, 2021). To further support data-driven decisions, the Education Sector Plan 2022–2026 emphasises the strategic importance of “strengthened partnerships with the private sector, civil society and donor partners” (Government of Sierra Leone, 2022, p. v). At present, development partners and the Minister of Education are the main drivers of demand for education research.
Producing evidence
Despite this abundance of data, several challenges prevent data from being used to generate meaningful evidence through research. Currently, the scope of evidence production within the three governments interviewed appears to be limited to collecting and storing data, while most meaningful analysis is generated through partnerships of different kinds.
A growing body of research suggests that the very way that data is accessed and shared can impact the formation of research partnerships and, as a result, the production of meaningful evidence (Rossiter, 2020). In Kenya, education data can be accessed through formal requests, which can impact the timely production of evidence:
“There is an issue of intellectual property and bureaucracy. If a government organisation or university wants data, a formal letter has [to be] sent to the Director” (Government stakeholder, Kenya).
Stakeholders in Kenya expect that stronger inter-agency partnerships could open more avenues for data sharing and, as a result, enhance evidence production. In contrast, education data in Bangladesh is openly accessible. But because BANBEIS’s mandate is to collect and manage education data, it commissions research from private sector research firms, academics, and development partners.
Across all three countries, rigorous research is developed through partnerships but is often unable to meet timely demands for evidence. To generate rapid evidence, governments often conduct their own analysis, which can raise issues of quality. For example, in Sierra Leone, where the World Bank heavily supports evidence production, a stakeholder described what happens when the Government conducts its own analysis:
“It isn’t really analysis. It’s just tabulation and trends that don’t get to the root causes of issues and [so] the policy is not targeted as a result” (Government stakeholder, MBSSE, Sierra Leone).
A stakeholder from the Teaching Service Commission (TSC) believes that having a trained research team in-house would help build on the quality of evidence:
“There is a need to have a solid research team with more manpower and capacity for research. The absence of this creates data gaps, making it difficult to produce evidence” (Government stakeholder, TSC, Sierra Leone).
In this sense, a reliance on partnerships can be akin to using plasters on gaps in evidence production. While rigorous research does not need to be produced in-house, government stakeholders should be equipped to produce meaningful and rapid analysis. For this reason, Bangladesh’s Ministry of Education aims to provide more sustainable approaches to evidence generation by addressing research capacity challenges within the Ministry and the higher education system as a whole.
Communicating evidence
Translating evidence into actionable recommendations requires actors to communicate it effectively; however, interviews with stakeholders highlight challenges in differentiating evidence from data. Could ineffective mechanisms for producing evidence also impede the way evidence is communicated?
As Bangladesh focuses on improving stakeholder research capacity to reform these channels, the Ministry of Education plans to “develop, institutionalise and mainstream” some research under the BERDIE network via an information library to allow decision-makers access to evidence (stakeholder from A2i Bangladesh). However, as stakeholders in Sierra Leone emphasise, even quality research must be effectively contextualised to be valuable:
“They will send an email: ‘Here is our research’. But there is no ‘So what?’ Even when development partners conduct research, there is little engagement about what it actually means” (Government stakeholder, MBSSE, Sierra Leone).
In Kenya, engagement with evidence appears to be supported through advocacy campaigns. For example, a stakeholder from Kenya found that advocacy campaigns led by development partners, such as UNICEF, have helped develop an environment where evidence was not only shared but also engaged with and responded to through interventions.
Using evidence
Many stakeholders described the Covid-19 pandemic as having driven demand for evidence about distance learning interventions and learning loss. This helped foster coherence between evidence production and usage. A stakeholder in Kenya noted that this made it easy to see how evidence was being used to make decisions:
“During Covid, we knew our findings were being used. But now, we have no way of finding out whether the Ministry of Education is making use of your research. You know that you are sharing it in conferences, in workshops. But like musicians, we don’t know who is listening to our music” (Government stakeholder, Kenya).
The fact that evidence producers across different countries are uncertain whether their reports are being used raises the question: are the research demands of decision-makers being articulated? In Sierra Leone, a stakeholder mentioned that it is “seldom” that actors besides the Minister of Education or development partners request research. With few demands for evidence made by evidence users, examples from MBSSE show that evidence producers must invest their own time in identifying areas of research. What can help stakeholders across LMICs clearly ask for evidence?
“Decision-makers must be oriented to understand what research should be conducted to support their decisions. Essentially, the basics of what a research question is” (Government stakeholder, DSHE, Bangladesh).
Capacity-building around the fundamentals of research could help ensure that the evidence produced connects policymakers closely to educational challenges on the ground. Decision-makers’ uncertainty about what research can be used to scaffold policy may be straining the coherence of the entire evidence ecosystem.
What can we learn from stakeholders?
During the Covid-19 pandemic, actors across different areas of the evidence ecosystem worked together through partnerships to ensure data shaped education decision-making. To continue this momentum, our interviews with stakeholders suggest that demands for evidence can help shape effective evidence production and increase engagement when evidence is being communicated. How can this be supported?
- Evidence producers could benefit from training in how to translate data into meaningful analysis. Partnerships with the private sector and development partners can provide a temporary solution to this challenge. While rigorous research can be supported through partnerships, stakeholders believe timely evidence production can be achieved through improving basic research skills in-house.
- Evidence intermediaries would benefit from learning how to differentiate evidence from data. While evidence libraries can help increase access to research, evidence should also be communicated and advocated for in a way that is relevant to decision-makers now.
- Evidence users should be guided to make clear demands for evidence to ensure resources (e.g., financial and personnel) are effectively used to inform time-sensitive decisions.
Government stakeholders in our focus countries are committed to improving the evidence ecosystem, but “evidence is useful when it is aligned to goals” (Brighouse et al., 2018). A lack of specificity in the goals of, and demands for, evidence risks wasting scarce resources on producing evidence that is not required or used. With stronger demands, evidence can also be more effectively communicated and engaged with. To support this endeavour, EdTech Hub is working closely with government partners to learn how evidence can be produced, communicated, and used in the evidence ecosystems of LMICs with varying degrees of fragmentation and centralisation.