Opinions, Behaviours, and Frustrations: Lessons on Teacher Technology Use in Pakistan

Technology uptake is not trivial, contrary to what some in Pakistani policy circles may believe. We have heard ministers talk excitedly about how technology can help ‘leapfrog’ progress and serve as a panacea for wide-scale educational problems. Think about the last time you shifted from one platform or technology to another and changed the way you routinely did things, no matter how small the shift was. Unless there is a big enough incentive or a major efficiency gain, most people remain reluctant to adopt new technology. This is even more true for populations that do not live or work in settings that are already digitized.

This blog post zooms in on an innovation – a teacher technology-support tool (referred to as the tech tool) – and early insights into its uptake. The tool has been introduced in the Targeted Instruction in Pakistan (TIP) program, currently being piloted in 1,250 primary schools in the Mardan and Peshawar districts of Pakistan’s Khyber Pakhtunkhwa (KP) province, with support from the Research on Improving Systems of Education (RISE) program. We use TIP and TIP-KP interchangeably to refer to this program for the rest of this blog post. Another version of this program, supported by the EdTech Hub, is set to launch soon in over 500 public primary schools in the Islamabad Capital Territory (ICT).

Before proceeding further, a quick introduction to Targeted Instruction (TI) is warranted. However, note that this piece is not about the difficulties of adopting the entire TI approach (whether tech-enabled or not). It is focused on the adoption of tech in the context of TI.

We also want to acknowledge the massive literature on teacher adoption of new teaching practices and the rich literature on technology adoption more generally. Our contribution in this blog post is limited to insights from this program.

What is TI?

Around the world, Targeted Instruction interventions are systematic approaches to helping children catch up to grade level by filling in learning gaps in foundational skills. Rather than delivering standardized content to all students in a grade, a typical TI intervention assesses every student’s actual learning level and identifies the key concepts that are holding each child back. The program then sorts students across or within classrooms into different groups according to learning levels, and teachers deliver targeted content that is most appropriate for each learning level. After the program is completed, students return to their classes with the foundations they need to tackle more advanced content, allowing them to continue their learning journey through primary school and beyond. TI interventions can be conducted with or without technology. However, tech is particularly well suited to increasing efficiency in TI style programs by facilitating the diagnostic and targeting processes.

Why TI? 

Data from domestic assessments tells us that primary school students are not learning at grade level, and COVID-induced closures have made matters worse. 

Figure 1: Pakistan’s learning crisis

There is growing recognition that children are not learning at grade level, and of the long-term ramifications of this gap. However, the public education system is not designed to address it: students risk compounding learning losses as they build on shaky foundations. Focusing on ‘foundational learning’ is one solution to this perennial problem, and targeted instruction programs are designed to address foundational learning gaps.

How is TIP-KP different?

Overall, the TIP-KP model follows the basic approach described above; see Figure 2 below for a visualization of the learning process that happens in TIP classrooms. However, what makes this program unique is that it was designed with specific system-level frictions and constraints in mind and uses context-specific approaches to address them, making it especially suited for deployment in Pakistan.

Figure 2: TIP-KP model

When we set out to design TIP, we knew that for a new intervention to work and scale efficiently it would have to address key constraints in the system at all levels. In our case this meant getting buy-in from:

  • Policymakers and senior bureaucracy, who had to put the regular, more complex curriculum on hold for a few months (only in pilot schools) to allow teaching that addresses foundational gaps. This was even trickier because the government had recently introduced an updated curriculum in 2020, which has been criticized for overburdening primary-grade students.
  • Overburdened head teachers and teachers, who are not accustomed to deviating from the academic calendar and are evaluated on the pace of syllabus completion (not student learning outcomes) due to a rigid final exam schedule.

To address the second constraint – i.e. to reduce administrative burden on teachers to make it easier to integrate innovative pedagogical practices with existing ones – TIP departs from traditional TI models in two unique ways:

  1. TIP is designed for existing public-school teachers to implement during regular school hours. It does not require hiring additional staff or adding school teaching hours – both attributes make it a more likely candidate for scale compared to alternative approaches.
  2. It leverages low-cost, teacher-support technology to empower teachers to improve student learning outcomes. The TIP program introduces a low-cost technology tool (an app that runs on a smartphone, tablet, or desktop) to reduce the administrative burden of implementing targeted instruction and help teachers efficiently conduct time-consuming activities – repeated testing, grading, sorting, and tracking. This is different from other education technology interventions that require costly new devices. Our tool can even function offline after an initial download.

What is the TIP tech-tool?

While there is a plethora of learning management systems (LMSs) available for schools with many similar features, the TIP tech tool has been co-designed by local education experts and engineers with regular input from schools and teachers, specifically for use in low-resourced, low-digital literacy settings, such as public and low-cost private schools in Pakistan. 

The TIP tech tool serves two main purposes (see Figure 3):

  • It automates the most tedious parts of TIP implementation, i.e., it is a fast-grading tool for teachers. Teachers enter student diagnostic data, and the app determines each student’s learning level, sorts children into groups by learning level, and assigns an instructional plan for each group. This saves the enormous amount of time that otherwise goes into grading student test papers and compiling results manually.
  • It is a repository of materials that the teacher may need to implement TIP – including teacher training materials, instructional materials such as lesson plans and associated teaching aids, videos illustrating various activities to be implemented in the classroom, and a library of diagnostic, formative, and summative assessments.

Figure 3: TIP tech tool – some screenshots to showcase app features
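To make the grading-and-sorting step concrete, here is a minimal sketch of the kind of logic involved. The level names, score cutoffs, and student data below are illustrative assumptions for this post, not the TIP tool’s actual implementation:

```python
# Hypothetical sketch of the diagnostic-to-grouping flow described above.
# Level names and cutoffs are illustrative assumptions, not TIP's real logic.

def learning_level(score, max_score=20):
    """Map a diagnostic score to a learning level (illustrative cutoffs)."""
    pct = score / max_score
    if pct < 0.25:
        return "Beginner"
    elif pct < 0.50:
        return "Letters/Numbers"
    elif pct < 0.75:
        return "Words/Operations"
    return "Grade-level"

def sort_into_groups(diagnostics):
    """Group students by learning level from {name: score} diagnostic data."""
    groups = {}
    for name, score in diagnostics.items():
        groups.setdefault(learning_level(score), []).append(name)
    return groups

# Example: three (hypothetical) students sorted into instructional groups.
print(sort_into_groups({"Ayesha": 4, "Bilal": 9, "Sana": 18}))
```

The point of automating this step is that, once diagnostic scores are entered, grouping is instantaneous – the task that teachers otherwise performed by hand for every student.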

This blog post is not focused on the features of the TIP tech tool or its (software) development process. This piece is also not about the uptake and success of this tech tool in TIP implementation in KP, as that data will only become available after the first cycle of the program ends, around the first quarter of 2023. What this blog post does focus on is some preliminary insights into the ‘tech adoption behaviors’ of the primary beneficiaries of the TIP tech tool, i.e., the primary school teachers in TIP pilot schools. These insights are valuable from a policy perspective and are also being used to inform the next round of TIP implementation, with an expanded set of research questions, in the ICT.

Who the tech tool benefits – profiles of a typical teacher 

To understand how to address teaching and learning challenges, we describe three profiles of typical prospective TIP teachers included in the TIP pilot sample. We focused on providing a tech solution that addresses the needs of a large majority of teachers, although we realize that wide variation in levels of connectivity across school communities means it is not possible to develop a one-size-fits-all solution. While all of these examples are from teachers in KP, we find that teachers in the ICT have similar profiles and challenges.

Early insights into tech adoption behaviors of teachers

We tested the TIP program and the tech tool with many teachers similar to Arifa, Babar, and Noor Alam. Through both qualitative and quantitative data collected during product beta testing, a teacher training workshop with almost 5,000 teachers, and baseline surveys in 1,250 schools, we have gathered preliminary insights into the tech adoption behaviors of TIP pilot schoolteachers. Teacher feedback, behaviors, and especially frustrations during this testing phase gave us useful insights into what may make or break adoption of the tech tool with our target population. We summarize these insights using the four categories described below. We hope to collect further rich insights as the program rolls out in August 2022, as the first cycle is completed, and as the program is piloted and rolled out in the ICT.

  1. Old habits die hard

Teachers find comfort in maintaining paper records. We spent PKR 25 million on printing lesson plans and testing tools for teachers and students at 1,000 TIP treatment schools. That money could have been saved or spent on facilitating the tech tool usage had teachers been open to using electronic versions of these materials in the tech tool, in place of paper. They all asked for paper copies as well and took a long time to adjust to using only technology.

  2. (Only) Seeing is believing

Teachers are reluctant to rely on what they cannot touch and see. During training, teachers remained reluctant to enter grades directly into the tech tool, both for fear of losing their marking records to technical glitches and because they were unsure whether the sorting tool would rank their students correctly. So, they performed marking on paper and calculated total scores manually, instead of entering marks in the tool and letting the app do the calculations for them.

One teacher agreed to try doing it the recommended way. After she finished, she excitedly exclaimed that it had hardly taken her a minute to do what the others were spending considerable time on. After hearing her, several other teachers wanted to try it too and found that it saved them a lot of time and mental effort. They also validated the app’s calculations against their own paper-based calculations, which gave them confidence in its accuracy. So, peer modeling and training help, although whether this behavior will persist remains to be seen. During a teacher training workshop, the experiential session on manual versus automatic grading and sorting of students was particularly effective in helping teachers realize the value of the tool.

  3. Ease of app access and interface may play an important role in determining uptake

Remembering usernames and passwords became an unnecessary hurdle to adoption. Teachers kept forgetting their account information, found it hard to connect to a network, or got frustrated when the network was poor and data did not sync or download properly. All these small things pile up and create a barrier to adoption. However, the WhatsApp-inspired, easy-to-use interface of our tech tool has helped even more senior and less tech-savvy teachers easily navigate the app once logged in.

  4. Incentives, prompts, and peer support may help uptake

Teachers are likely to be hardwired to do only what is necessary to get the task at hand done. For example, during beta testing, when teachers found out that they were only required to enter diagnostic results for the app to sort students, they entered only the diagnostic data and skipped checking off completed lesson plans and entering quiz scores. They would simply check quizzes and group students for revision lessons manually, which took time. So, we added a ‘revision peer group formation’ feature in the app as an incentive to enter quiz results.

Similarly, we observed that teachers were only opening the app to enter required testing data and were not using the other materials (lesson plans, training materials) in the tech app. We have now added visual prompts, such as a ‘watch videos’ instruction in red font, as shown in the photo. This serves as an additional incentive for teachers to open the lesson plan module, despite having physical copies of the lesson plans.

Ultimately, most teachers understood the value of our tool. Some teachers who did not have personal smartphones at the time of training had even decided to purchase devices just to be able to use the tool. Others, in the control group with no technology access, persistently reached out to gain access to the tool to conduct the diagnostic activity.

We realized that once the program is rolled out, teachers will still need some nudging to use the tech tool. So, we assigned one proactive teacher in each school as a ‘Tech Captain’ – someone who would make sure that teachers add data to the tech tool when they are required to do so as well as help resolve technology related issues such as helping with logging into the app or connecting to the Internet.

What’s next?

Once the TIP-KP program rolls out in August 2022, we expect to answer, with evidence, the following questions and more:

  • What is the actual teacher and school adoption rate of the tech tool?
  • What are the enablers of and barriers to tech usage, and how do they correlate with teacher and school demographics?
  • What does user data suggest about modifications to the tool?
  • What is the impact of teacher tech usage on student learning outcomes?

The evaluation of a version of this program in the Islamabad Capital Territory is supported by the EdTech Hub and zooms in on the role of technology in distributing content equitably, sustainably, and cost-effectively. We expect to answer a broader set of questions beyond technology uptake, including how empowering teachers and parents with information and guidance through technology can help improve children’s learning. We would like to thank all our collaborators for supporting these projects, including RISE, the World Bank, the EdTech Hub, and the Ministry of Federal Education and Professional Training.

Connect with Us

Get a regular round-up of the latest in clear evidence, better decisions, and more learning in EdTech.



The findings, interpretations, and conclusions expressed in the content on this site do not necessarily reflect the views of the UK government, the Bill & Melinda Gates Foundation, the World Bank, the Executive Directors of the World Bank, or the governments they represent.

EDTECH HUB 2024. Creative Commons Attribution 4.0 International License.
