
Applying sandbox methodologies to address education policy and implementation challenges in Southeast Asia

A photo taken at the DepEd office in Manila following an M&E workshop conducted by the EdTech Hub team for representatives and technical staff from multiple ministry bureaus

This blog focuses on key findings from ‘Sprint 0’ and is the first in a series that will follow our journey as we sandbox key education challenges with our partners across Southeast Asia. Through the series, we’ll unpack how we are learning by doing: testing ideas for Southeast Asian education challenges through real-world experiments, whether in policy or programme design.

Across Southeast Asia, education systems are tackling complex challenges — from shaping evidence-based policies to ensuring that learning truly reaches every student. Yet no two countries face these challenges in the same way. Drawing on work with colleagues in the Ministries of Education in Indonesia and the Philippines, as well as a regional SEAMEO centre focused on Technical and Vocational Education and Training (TVET), this blog explores the distinct priorities each team is tackling through the sandbox approach. The sandbox provides a practical, iterative way to make progress on real problems: we work with our partners to test ideas, learn from implementation, and refine solutions, rather than relying on planning alone.

Through the ASEAN-UK SAGE programme, we are working with partners across the region to identify and clarify problems, design and test ideas and assumptions about each specific challenge, and strengthen solutions through rapid, real-world experiments. With practical tools, expert support, and focused learning, the sandbox helps teams turn promising ideas into evidence and insights that can inform better education policy and delivery.

Meet our partners and the specific learning goals they are working toward through a sandbox:

  • Ministry of Primary and Secondary Education (MoPSE), Indonesia, through the Centre for Education Standards and Policy (Pusat Standar Kebijakan Pendidikan, PSKP): Working to develop a standardised, reliable evidence-based policy-making process that key units across the ministry can follow, in line with the National Administration Body’s policymaking guidelines.
  • Department of Education (DepEd), Philippines, through the Bureau of Learning Delivery: Designing data collection approaches for the Omnibus EdTech Policy to shift from data-for-reporting towards data-for-decision-making to tailor support for schools based on where they are in their digital transformation journey.
  • SEAMEO VOCTECH, a regional SEAMEO centre for TVET: Enhancing and optimising SEAMEO VOCTECH’s regional digital training platform, SEA-VET Learning, to strengthen engagement with self-paced learning and professional skills development across Southeast Asia.

Laying the groundwork for the sandboxes through ‘Sprint 0’

Each sandbox is composed of several sprint cycles, each lasting a few weeks, and begins with a ‘Sprint 0’ – the preparatory phase before the main sprint cycles. Across the sandboxes conducted with our three partners, a strong foundation was established through a series of focused preparatory activities. In Sprint 0, the teams convened researchers, programme leads, and ministry representatives to clarify the problem, identify knowledge gaps, and surface early assumptions about particular interventions or processes.

Next, we gathered the necessary resources, including tools, reference materials, and stakeholder contacts, to ensure teams were ready for the work ahead. Early alignment with the key implementing stakeholders was crucial for confirming objectives and securing buy-in from the start. These preparatory steps created a solid foundation for subsequent testing, validation, and iteration.

Key findings from Sprint 0:

For MoPSE in Indonesia:

Sprint 0 helped us understand what really shapes how teams work and engage with evidence-based policy-making. The EdTech Hub and implementing team began with the assumption that there were no standard policy-making processes and that policy-making units needed toolkits to follow. What we found instead was that some processes already exist, and the real challenge lies in leadership buy-in, culture, and ways of working. These insights helped us adjust our approach to focus as much on people and collaboration as on tools and methods:

  1. The right respondents matter: prioritising depth over hierarchy

Interviews with senior leaders offered limited operational insight, so we pivoted to respondents from the units’ technical teams for more practical perspectives. A targeted survey expanded the pool, ensuring broader and more relevant input.

💡The lesson: choose respondents based on the depth of their involvement in particular processes, rather than their position.

  2. Proactive role clarity secures MoPSE’s buy-in and collaboration

While the PSKP team within MoPSE showed strong commitment to the sandbox’s success, this was a new way of working, and roles and expectations were initially unclear. In response, we clarified roles and responsibilities early and created an onboarding toolkit to align methods and expectations.

💡The lesson: confirm assumptions early and set clear parameters for roles and responsibilities to enable strong collaboration towards the sprint’s goals.

  3. Leadership buy-in enables team alignment

We initially shared details of the sandbox initiative before securing official leadership endorsement, which led to resistance and mixed messages. We then ensured alignment with PSKP leadership before communicating more widely.

💡The lesson: visible endorsement from leaders is crucial to build consensus and sustain momentum.

  4. Mitigating bias in qualitative insights

Our initial approach of handpicking interviewees created selection bias, leading to overly positive or incomplete insights. We adjusted by requesting supporting documents such as process maps and reports to ground findings in evidence.

💡The lesson: combining interviews with documentation gives a fuller, more objective picture of how processes really work.

For DepEd in the Philippines:

Sprint 0 helped us understand the objectives of the Omnibus EdTech Policy, and its Digital Maturity Assessment (DMA) tool, from the perspective of engagement at the school level. Focusing on the feasibility and viability of the data-capture process, we worked with three assumptions: 1) users convene and discuss in order to capture a holistic school context, 2) the tool is easy to comprehend and fill out meaningfully, and 3) self-reported data is reliable. We learned about the underlying motivations for truthful self-reporting at the school level, gathered inputs to improve the tool, and collected suggestions to streamline data collection processes.

  1. User experiences are valuable in ensuring that data collection tools are effectively designed to meet policy objectives.

Focus group discussions with school heads and teachers in the field allowed us to gather deeper insight into how the DMA tool is perceived and used for school-level data reporting. Their specific and detailed observations not only revealed the tool’s strengths and areas for improvement, but also led us to revisit other areas of the policy and its development process.

💡The lesson: Feedback from end-users provides a more nuanced understanding of how tools can effectively and purposefully support data-gathering.

  2. Understanding school-specific contexts helps create a more inclusive and equitable approach to data accuracy and decision-making.

Through the sprint, it became clear that the DMA self-reporting tool must reflect the wide variation across schools, including differences in resource availability, infrastructure, and the digital competencies of teachers and learners. These variations directly influence how schools report their digital maturity. Without acknowledging these contextual differences, lower maturity scores risk being misinterpreted as a lack of effort rather than the result of structural constraints. The school visits provided insight into how these conditions shape reporting practices and digital readiness profiles.

💡The lesson: Embedding school-level context into data capture leads to more accurate interpretation of maturity levels and supports fairer, more actionable decision-making. 
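To make this concrete, here is a minimal sketch, in Python, of what contextualised interpretation could look like. The field names, resource tiers, and scores below are hypothetical illustrations, not DepEd’s actual DMA schema: the idea is simply to benchmark each school against structurally similar peers rather than against a single national figure.

```python
# A minimal sketch (hypothetical fields and tiers, not the real DMA
# schema): compare each school's score to the median of schools in the
# same resource tier, so structural constraints are not misread as
# lack of effort.
from statistics import median

schools = [
    {"name": "School A", "resource_tier": "low",  "dma_score": 2.1},
    {"name": "School B", "resource_tier": "low",  "dma_score": 1.4},
    {"name": "School C", "resource_tier": "high", "dma_score": 3.8},
    {"name": "School D", "resource_tier": "high", "dma_score": 3.1},
]

# Benchmark per tier: the median score among structurally similar schools.
tiers = {s["resource_tier"] for s in schools}
benchmark = {
    t: median(s["dma_score"] for s in schools if s["resource_tier"] == t)
    for t in tiers
}

for s in schools:
    # A negative gap flags a school lagging behind its *peers*, which is
    # a fairer trigger for targeted support than a low absolute score.
    gap = s["dma_score"] - benchmark[s["resource_tier"]]
    print(f'{s["name"]}: score {s["dma_score"]}, '
          f'tier median {benchmark[s["resource_tier"]]:.2f}, gap {gap:+.2f}')
```

Under this hypothetical framing, a low-resource school scoring near its tier median reads as on track given its constraints, while the same absolute score in a well-resourced tier would flag a genuine gap.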

  3. Policy tool development is an iterative process that validates assumptions, ensures clarity, and strengthens targeted decision-making.

Information gathered at different stages from different stakeholders feeds back into the policymaking process. This mechanism reinforces the policy’s goal of remaining centred on human needs and integrated with other EdTech efforts. The sprint also allowed us to rethink the feedback loop to schools and other stakeholders as a means of strengthening the policy. In addition, we identified opportunities to leverage existing datasets to reduce duplicated data collection efforts and ease the reporting burden on schools.

💡The lesson: Iteration produces progress, as reflected in clearer definitions, stronger coordination across governance levels, and a more cohesive policy narrative.

For SEAMEO VOCTECH:

Sprint 0 helped us become familiar with the SEA-VET Learning platform and the VOCTECH team. We were able to access both the platform’s backend analytics and internal team documents, which gave us many insights into how users currently engage with the platform. Throughout Sprint 0, we found that open and consistent communication with our VOCTECH partners was essential to align efforts, prevent duplication, and keep our work focused on providing unique value.

  1. Early partner coordination ensured our work built on existing insights and avoided duplication.

Our initial plans included gathering insights from users and key stakeholders through focus group discussions. However, we learned that VOCTECH had conducted their own deep dive into SEA-VET Learning earlier in the year, identifying gaps and strategies for improving engagement. We reviewed the documentation from VOCTECH’s internal workshops and used it to develop our theory of change and assumptions, rather than repeating the exercise.

💡The lesson: Early coordination with our partner helped us identify gaps and ensure our activities built on existing work.

  2. Clear communication helped align efforts and ensure sandbox activities complemented, rather than duplicated, ongoing work.

We realised partway through the sandbox design phase that VOCTECH was also developing strategies with other partners to address the sustainability of the platform. When we recognised the potential for duplication of effort, we refined the sandbox to focus explicitly on course design and presentation.

💡The lesson: Having clear communication with partners and awareness of parallel projects is essential to ensure the sandbox activities are complementary rather than duplicative.

  3. Understanding user behaviour helped us refine how we measure success.

In our initial review of the platform analytics, we noted low engagement metrics, like the number of minutes spent on a course. Through discussions with the VOCTECH team about our findings, we learned that lower engagement metrics seen on web analytics could be misleading, as learners may only use the platform to download course content.

💡The lesson: There are multiple ways to measure an output, and we need to validate that our metrics for success reflect how learners actually use the platform.
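As a minimal sketch of this point, the Python below uses a hypothetical event log (invented field names and thresholds, not SEA-VET Learning’s actual analytics schema) to show how a download-centred learner can look disengaged when only session minutes are counted, and engaged once downloads are counted too.

```python
# Hypothetical event log: u2 spends almost no time on the platform
# but downloads materials to study offline.
from collections import defaultdict

events = [
    {"user": "u1", "type": "page_view", "minutes": 35},
    {"user": "u2", "type": "page_view", "minutes": 2},
    {"user": "u2", "type": "download",  "minutes": 0},
    {"user": "u2", "type": "download",  "minutes": 0},
]

totals = defaultdict(lambda: {"minutes": 0, "downloads": 0})
for e in events:
    totals[e["user"]]["minutes"] += e["minutes"]
    if e["type"] == "download":
        totals[e["user"]]["downloads"] += 1

for user, t in totals.items():
    by_minutes = t["minutes"] >= 10               # time-on-platform only
    combined = by_minutes or t["downloads"] > 0   # time OR offline use
    print(user, t, f"minutes-only: {by_minutes}, combined: {combined}")
```

By the minutes-only measure, u2 appears disengaged; treating downloads as a form of engagement tells a very different story, which is exactly the discrepancy the VOCTECH team flagged.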

What happens next?

Now that the teams have a clearer understanding of the problem, the next step is to turn that understanding into action. Sprint 1 marks the beginning of a series of short, focused cycles where teams test ideas in the real world and reflect on what they learn before moving forward. Each sprint is time-boxed, meaning it runs for a set period with clear goals and boundaries — allowing teams to make progress quickly without getting stuck in long planning phases.

In this first sprint, teams will focus on identifying what needs to be true for their ideas to succeed and the assumptions that carry the most risk if they turn out to be wrong. Instead of trying to fix everything at once, they’ll design small, practical experiments to test these assumptions first. These experiments will produce early evidence about what works and what doesn’t, guiding smarter decisions about where to focus time and resources. By shifting from discussion to hands-on testing, teams start learning through doing, building confidence in their ideas and laying the groundwork for solutions that can grow and last.

Want to find out more about the sandbox methodology?


Acknowledgements

Thank you to our partners from DepEd in the Philippines – Director Gerson Abesamis and the team from the Bureau of Learning Delivery; MoPSE in Indonesia – Dr. Irsyad Zamjani and the team from the Centre for Education Standards and Policy (PSKP); SEAMEO VOCTECH – Dr. Paryono and the team behind the SEA-VET Learning platform; and to all those at EdTech Hub engaged in the sandbox work – Aprillia Chrisani, Resiana Rawinda, Delanie Honda, Nawaz Aslam, Jamie Donato, Gita Luz, Nimra Afzal, Sangay Thinley, Jazzlyne Gunawan, Haani Mazari, Laila Friese, Jillian Makungu, and Sophie Longley.


This publication has been produced by EdTech Hub as part of the ASEAN-UK Supporting the Advancement of Girls’ Education (ASEAN-UK SAGE) programme. ASEAN-UK SAGE is an ASEAN cooperation programme funded by UK International Development from the UK Government. The programme aims to enhance foundational learning opportunities for all by breaking down barriers that hinder the educational achievements of girls and marginalised learners. The programme is delivered in partnership with the Southeast Asian Ministers of Education Organization (SEAMEO), the British Council, the Australian Council for Educational Research, and EdTech Hub.

This material has been funded by UK International Development from the UK Government; however, the views expressed do not necessarily reflect the UK Government’s official policies.


EDTECH HUB 2025. Creative Commons Attribution 4.0 International License.
