Written by Alice Carter
This is the second in this series about our sandboxes. If you haven’t already, read about our approach to experimentation.
Across the Hub, we’re proponents of applying our own tools and approaches to our own thinking. For innovation, that meant testing the assumptions we were making about whether and how the sandboxes would work. So, just as we would advise the entrepreneurs or policymakers we support, we:
- defined our goals and our hypotheses about how we’d achieve the desired impact.
- separated our assumptions from facts, and prioritised our assumptions from most to least critical. The most critical assumptions are the ones that would undermine the entire strategy if invalidated.
- tested those assumptions in the real world, with real people. This is an important aspect of how we will continue to work. For instance, instead of asking people whether they’d be interested in something, we will strive to make it available to them and gauge genuine interest through real take-up.
- worked fast and light, always asking: what can we do tomorrow? Our work is designed around learning, so we took the fastest route to learning the most.
- were led by insight, pursuing what works, eliminating what doesn’t, capturing new ideas and fresh thinking along the way.
We tested our sandbox thinking by running a mini-sandbox (a trial run) in Malawi in the last few months of 2019. Our trial was a simplified version of a sandbox. Here is how it compares with the future, fully-fledged version we’ll be launching later this year:
| | Mini sandbox | Future sandbox |
|---|---|---|
| Location selection | Selected Malawi, based on existing networks | Open calls for governments and ministries to apply to take part |
| Technology selection | Closed competition | Open competitions |
| Duration | 4 weeks | Up to 2 years |
| Training | 1 week | 3 months |
| Sharing learning | Reflections shared publicly | Detailed insight and learning shared publicly |
In September 2019, we launched a closed competition open to a few tech entrepreneurs working in education in Malawi. The selection panel chose Padziwe, a Malawian company developing tech such as ‘Teachers Desk’, a continuing professional development (CPD) application for teachers in Malawi.
Teachers Desk is a platform for teachers to continue their training, engage with fellow teachers, and catch up on important developments in the teaching profession. It contains engaging modules on teaching approaches such as learner-centred instruction, makes curriculum content easily accessible to teachers, and allows them to take part in discussion forums.
Not only does the Hub believe that teachers’ continuous professional development is one of the more promising areas for technology to play an important role in education; the team at Padziwe also demonstrated a real openness to learning and working with us on this alpha.
Making the most of four weeks
We condensed a two-year programme into a four-week test. This meant we had to design activities that would help us gain insight into our process in the shortest possible time.
We chose to do one week of immersion, planning and training with the Padziwe team; followed by three weeks of remote coaching and support from the Hub. Drawing on methodology from Lean Impact, we worked with Padziwe to identify all the assumptions being made within their idea. We then worked in sprints to test these assumptions.
A sprint is a term from Agile project management: work is focused into short, dedicated bursts, with a shared expectation of what will get done in each burst. Each of our bursts was focused on a key learning outcome.
The first week-long sprint is outlined below. Since this was a mix of training and working all at once, we dedicated a day to each of the three types of hypotheses in Lean Impact: value, growth, and impact.
| Day | Activity |
|---|---|
| Monday | Introducing Lean Impact and sprints |
| Tuesday | Value: spending a day with teachers to understand which aspects of the product they value or not |
| Wednesday | Growth: a session with donors and others who might procure the product longer term |
| Thursday | Impact: a consultation with colleagues at the REAL Centre at the University of Cambridge, to understand which proxies we might use to measure impact and benefit from their tacit knowledge of products operating in this space |
| Friday | Sprint review and reflections |
Four weeks later…
Here are the headlines on what the Padziwe team was able to achieve in these four weeks. There’s more information in the post from Pilirani, Padziwe’s founder and CEO.
Do teachers value Teachers Desk?
Work that sought to answer this question resulted in a useful pivot for the Padziwe team. After testing ideas with teachers, it was clear the product needed to shift from pedagogy-only materials to subject-specific content.
Can we build a commercially sustainable business model by working with a partner?
Over the course of the four weeks, the team tested ideas and possibilities directly with would-be funders and partners, and as a result were able to secure a long-term project with an international non-governmental organisation. The team also secured a letter of support from the Malawian government to help them attract more investment and interest.
Are we making a difference to learning outcomes?
Through consultations with REAL Centre colleagues at the University of Cambridge, the Padziwe team tested their assumptions about how they might best track a positive impact on learning outcomes. Proxies for these outcomes included teachers’ ability to identify their own knowledge gaps, the use of effective teaching practices, and both teacher and student confidence.