Making use of learning assessment data: findings from six sub-Saharan African countries

24 February 2021

Teenage schoolgirls in a classroom in Ghana. (Photo: fivepointsix / Shutterstock.com)

Learning assessments have become an important fixture of 21st century education systems. From international large-scale assessments to citizen-led tests, many instruments strive to gauge how well students learn. But what happens to all this data, what are the risks, and does it influence educational policy-making and planning?

An IIEP-UNESCO research project is exploring these questions in sub-Saharan Africa and Latin America. As part of this, IIEP has just published six policy briefs and information sheets providing an in-depth picture of how learning assessment data is used in the planning cycle in the Gambia, Ghana, Guinea, Namibia, Senegal, and Zambia. The policy briefs also include recommendations for national policy-makers, planners, and international actors – an element with much potential to guide education actors in making better use of such data.

“One of the most striking findings of this research is that countries across the board are underusing learning assessment data compared to the potential that it has to improve policies that could in turn raise learning outcomes,” says IIEP associate researcher Ieva Raudonyte. 

What is holding back the use of learning assessment data? Here are four key findings from IIEP’s research:   

1. Consistently talk about learning assessment data

A lack of communication, or unclear communication, often blocks the use of learning assessment data in policy-making and planning. “Communication on learning assessment data needs to happen in a timely manner so that it feeds into the planning cycle at the right moment,” says Raudonyte. And it should not stop there – the assessment report must include actionable recommendations and a planned follow-up with all the different actors involved in assessments, even if they sometimes have different agendas. “Through the research, we saw a confrontation between the technicians who produce learning data and the policy-makers who need to make decisions based on that data,” says Raudonyte. “It is not always presented to them in a reader-friendly way.” The research also identified challenges around the dissemination of data between centralized and decentralized levels in education systems.

2. Ensure adequate technical capacity

Education actors need the technical capacity and skills to analyze – and use – learning assessment data. This is essential to guarantee their ownership of the assessment processes. “National ownership of learning assessments is indeed key,” says Raudonyte. “When national actors develop the skills needed to analyze and make better use of the data, they can become the internal engine driving the use of assessment data.” In this sense, technical capacity development – as an endogenous process – can help build and improve the entire learning assessment culture in a more sustainable and effective way. The research also found that a high level of turnover in assessment units can result in a lack of appropriate technical skills and increase the need to train new staff.

3. Pay attention to the level of analysis

Learning assessment data can open the door to many valuable insights about who is learning, and where, in a given country. “However, if we expect policy-makers to use learning assessment data in a meaningful way, it has to respond to their needs and be disaggregated, whether by region, gender, or other student and teacher characteristics. It needs to carry a certain level of analysis,” says Raudonyte. This is also where collaboration and capacity building come back into the picture – the right level of analysis will not suffice without fostering a culture of learning assessments. All of this takes time to develop: “Learning assessments are not just a mechanical exercise in the background.”

4. Find balance in how learning assessments are financed 

Countries need a mix of funding sources for learning assessments. This means not relying solely on external funding, but steadily increasing the share of national resources invested. This can have a major impact on the ownership of learning data and on actors’ autonomy to manage it – two major factors in ensuring that decision-makers receive meaningful data to help shape policies. The research also highlighted the need to consider financial feasibility when defining recommendations based on learning assessment data.

What is the link with learning? 

Learning assessment data holds huge potential. However, Raudonyte warns that good learning assessment data does not automatically translate into good educational policies. “Building policies to improve learning is a complex process,” she says. “Data from learning assessments is one important input that, if used appropriately, can contribute to better informing policies on learning.”

The research findings from Latin America will also be available in 2021, facilitating discussions across continents.