Opinion: Why all research programmes should have a helpdesk
Editor’s note: This is an Opinion piece and reflects the personal views and experiences of the author. It does not directly represent the EdTech Hub. We are excited to feature diverse perspectives from the sector and look forward to more conversations.
Introducing evidence windows and helpdesks
Too often in international education, and across international development more broadly, evidence created by research teams is never used by policymakers. Sometimes studies do not align with national priorities; sometimes they are simply not available at the right time. We need to do everything possible to ensure this isn't the case.
Helpdesks are one of the best tools we have available to ensure that research is used, and evidence windows are met. Over the last six years, I have been an FCDO staff member seconded to the EdTech Hub. This has given me lots of opportunities to interact with helpdesks, both as a helpdesk user and as part of the EdTech Hub helpdesk team responding to requests from World Bank, UNICEF, and FCDO staff.
In this blog, I outline the top three reasons that I believe helpdesks can be a game changer for the uptake of research in international development. Importantly, each of these reasons is relevant to overall programme effectiveness, not just the helpdesk component. I also take a brief look at which organisations are well placed to run a helpdesk, and then present a 'top 10' list of elements that I think make a helpdesk successful.
Evidence windows
For evidence to be used by a policymaker to influence a decision, it needs to be delivered to a policymaker in the right format at the right time. I refer to this as the ‘evidence window’ for research uptake. For research to be used by a policymaker through an evidence window it needs to:
- answer the right questions (for the policymaker)
- be delivered in the right format (for the policymaker)
- be delivered at the right time (for the policymaker)
- be delivered to the policymaker by the right people (for the policymaker)
- be delivered in the right way (for the policymaker)
Meeting the evidence window ultimately leads to improved uptake of evidence into policy and the real world. Evidence windows are opened and closed by a wide range of factors, such as political pressures, development partner financing processes, or personal ambitions.
Research and evidence helpdesks
A ‘helpdesk’ is a service that is provided by a team of technical specialists, to help a specific set of users deliver on particular goals. Key defining features of a helpdesk are being open to receiving requests within a specific scope, providing robust products, and delivering within a defined time. The service offered to users can be small or big — ranging from a short telephone call for advice to a considerable piece of technical assistance covering multiple months.
For example, the EdTech Hub helpdesk offers assistance to all FCDO, World Bank, UNICEF and Gates Foundation staff who are able to submit a request through a password-protected online form, and the hub team responds within 48 hours and then follows a set process. The helpdesk offers:
- Products and services: Expert Consultation, Programme Document Review, Topic Briefs, Curated Lists of Resources, Workshop Facilitation, and Participation in an Event.
- Scope: Digital personalised learning, Teacher continuous professional development, Data for decisions, Participation and messaging, Girls' education, and EdTech / digital learning / ICT in education strategy.
- Staffing: a core team of EdTech specialists, drawing on wider specialist expertise as needed.
Anonymised example request:
The Ministry of Education needed help to leverage the best EdTech evidence, in order to help secure a new World Bank loan. On behalf of the Ministry, the World Bank Task Team Leader (TTL) requested support from the EdTech Hub. In less than a week, EdTech Hub got together a small team of consultants — all based in the country or the region. EdTech Hub’s team provided an evidence review to the Ministry, in a straightforward PowerPoint format, to help bring the evidence base to their internal discussions. This was then followed by a series of expert consultations and a final document review. The result was that the Ministry was able to get the right evidence, in the right format, at the right time and from people they already trusted. This led to a significantly improved EdTech proposal.
Why are helpdesks a game changer?
1 – Helpdesks force research programmes to design products and services directly for Global South policymakers — improving local ownership and use of research
Building a helpdesk forces research programme teams to design and deliver a genuinely user-centred service. This helps shift the power to the hands of Global South policy advisers and policymakers.
In the real world of decision-making, evidence windows are often short. There is frequently only a very brief window in which to nudge a policy, strategy, business case, appraisal document or law. And the timing of these changes is decided by policymakers' own internal processes, not by the speed at which evidence is produced. It is often unclear whether responding to an evidence window will actually lead to success, so doing so can be seen as part of a 'development gamble'. Sometimes evidence windows pass without evidence being used to inform decision-making at all.
Donor-funded international development programmes have a complex mix of incentives, ranging across contractual deliverables, donor priorities, beneficiary government priorities, and operational necessities. The result is that without a user-centred response mechanism such as a helpdesk, or another means of incorporating policymaker priorities into research agenda-setting, research programmes may not deliver directly for the policymakers who are the most important potential consumers of the research. Often, bureaucratic incentives mean that research programmes deliver chunks of evidence to meet milestones and contractual requirements but miss the evidence windows of sporadic requests from real-world policymakers and decision-makers. For example, when a government is writing a proposal on a specific topic and needs a document review turned around in 48 hours, it's unlikely that a typical donor-funded programme team will be able to drop all their current work to do this. Stepping back and looking globally, the result is that these evidence windows are constantly opening and closing too fast for donor programmes to respond.
This is where a helpdesk changes the programme incentives. Contractual requirements are linked to the provision of the service, and positive feedback is a metric, so programme teams are incentivised to prioritise these messy but fruitful requests. It means that programmes need to put in place technically capable multidisciplinary teams that deliver a service iteratively, with a clear focus on their users. Helpdesk teams must also be well informed and constantly develop their awareness and understanding of the research in their field. This understanding of longer-term research being undertaken, especially by the research programme in which they are embedded, enables them to spot matchmaking opportunities between forthcoming research and evidence windows among policymakers who can use the findings.
Questions that helpdesk teams should be asking:
- What products or services does the helpdesk offer that actually deliver value to users? Very importantly, the 'user' in this case must be a core user of the service, not the staff member who is funding or overseeing the helpdesk.
- How much money is assigned to each request, and how will this deliver maximum value to the users?
- Within what timelines do users typically need responses? And what is the best format for users to request help from the helpdesk? This will need to trade off ensuring the user requests something the helpdesk can deliver (e.g. funnelling the request) against making the request process as easy as possible for the user.
2 – Helpdesks push research teams to understand what they are not good at, and what their comparative advantage is
It sounds obvious, but if you are going to deliver a helpdesk service, then you need people who can actually help (and not harm) a situation. That means your helpdesk team needs to understand the comparative strengths and weaknesses of the technical staff within the programme that runs the helpdesk. Often this is something that no single person in a programme actually has a view on. Where are your people the best-in-class leaders? And where are they just competent?
Understanding the comparative advantage of the people in a programme doesn't just allow the helpdesk to succeed; it strengthens the overall programme by helping it find the place in the ecosystem where it can deliver the most value. This forces a continued look at where the programme has expertise, and where that expertise is needed. In other words, continually refining a 'product–market fit'.
Although this seems obvious in some senses, in practice I find programmes are generally poor at saying what they are not good at.¹ As programme teams or donor partners, we need to be brave enough to clearly state what programmes will not be able to do at the earliest possible opportunity.
Questions that helpdesk teams should be asking:
- Who in our organisation can technically respond to different types of requests, and what is their likely availability?
- What is the scope of the helpdesk?
- What will the helpdesk not do?
3 – Helpdesks force research teams to develop agile operational structures that are efficient and lean
In my experience, 'operational delivery is international development'. You simply can't separate operations from the technical capability to deliver impact. It doesn't matter if you have the best technical team in the world if they are hampered by slow and/or bureaucratic operations that don't allow them to act at the right time.
A helpdesk whose success is measured by delivering on the needs of users forces swift and lean operations, because requests arrive rapidly and at short notice.
For example, if there is a request for a review of a government policy in two weeks, then there is no point trying to respond if the only option is a procurement to find an expert that takes two months. If the helpdesk KPIs mean the helpdesk performs badly by missing these opportunities, then it will force teams to focus on delivering efficient and lean operations.
My experience is that if each individual helpdesk request requires the helpdesk team to run a procurement process to find someone to respond, then the helpdesk will inevitably be too slow to meet most evidence windows. It is therefore important that helpdesks do not rely on a lengthy procurement process after a request has been received. Instead, the helpdesk needs a set of experts it can call on without contractual delays. EdTech Hub solved this challenge by creating a specialist network, with regular open calls for experts to join it.
The helpdesk team needs to find ways to ensure they can move quickly, whilst still adhering to contractual and donor policies. This isn’t an easy task, but in my experience it’s the difference between success and failure in a programme. These operational processes and approaches can then be used more widely across the programme – strengthening it in a wider sense.
Questions that helpdesk teams should be asking:
- How much does it cost in administration to respond to a request?
- What happens when people are on leave?
- What are your actual response timelines, and how are they tracked?
10 principles for a great helpdesk:
- Helpdesks need a great helpdesk manager — who is empowered to deliver. This person will need to have broad and relevant technical expertise, be a management wizard able to lead people and teams, and not be above doing menial tasks where needed. Finding this person will not normally be easy!
- Helpdesks must be open all the time. There is no point putting out specific calls for people to apply for the helpdesk, as it means the evidence window is almost certainly missed.
- Helpdesks must have a clear technical and geographic scope. No helpdesk can cover an entire sector or across sectors, in all geographies. To be useful it should be clear about the technical and geographic scope, and importantly what is not normally in scope.
- Helpdesks must have a clear list of products and services offered. Each product and service should include a specific timeline to set expectations all round. This includes when the user will first be contacted by a human, and when the first scoping call should take place.
- Helpdesks must have the right staff to deliver against the scope, products and services. The scope should be changed to reflect the staff and expertise, not the ambitious goals of a business case.
- Helpdesks should be free at the point of initial delivery. Budgetary and finance processes are too slow to respond to evidence windows, so the initial work should be covered by a central budget within the research programme. However, it would normally make sense for helpdesk users to have the option to 'buy in' to more enhanced helpdesk services, with the 'free' offer lasting until paid-for services are in place, to give continuity of support.
- Helpdesks should not need to run a procurement to start work. To meet the evidence window, they need people contracted already.
- Helpdesk teams themselves should decide which requests to accept, within the service scope. Where possible this shouldn't need donor approval, unless the request is very complex. Otherwise, the service will be too slow to be useful.
- Where possible, helpdesks should be directly open to Global South government officials, and not just to donor partners. EdTech Hub wasn’t able to test this directly, but often it was indirectly the case where development partner staff submitted on behalf of an official.
- There are always exceptions to the rules. Focusing on delivering value quickly for helpdesk users should be the overriding goal.
A final thought on who should run a helpdesk:
Not all organisations are well suited to run a helpdesk. The wider the scope of the helpdesk and the more products offered, the harder it is to run, and the fewer organisations are able to run it. For example, government development partners or UN bodies are unlikely to be well placed to run a service like this internally (without a significant number of permanent staff dedicated to the service). Any organisation that isn't focused on lean and efficient delivery is also probably not suitable. However, it doesn't matter whether a programme-led organisation runs the helpdesk or a specific helpdesk organisation is subcontracted, as long as they are given the mandate to deliver. In fact, I think it would be workable for a single helpdesk to sit across multiple programmes, but only if clear performance metrics are defined for each programme in relation to the helpdesk.
¹ This is often driven by donor partners, who spend years securing finance by saying their programme will do all things for all people, then procure a real-world team that can do some things particularly well and other things not so well. That's fine, but the expectation gap is often very large and leads to issues. ↩︎
With thanks to the following for helping review and develop this blog: Caitlin Coflan, Rudolph Ampofo, Ian Attfield, Sarah Lane-Smith, David Hollow and Laila Friese.