What Can We Learn from Field Experiments on Media, Communication and Governance?

BBC Media Action
"What constitutes good evidence is currently subject to intense debate within the international development community. This briefing shares the findings of a review of such methods used to evaluate the impact of media and media assistance on governance outcomes."
This briefing is based on a review by Devra Moehler, assistant professor at the Annenberg School for Communication, University of Pennsylvania, United States. It describes a review of nine field experiments and several sets of quasi-experimental studies, which revealed that, "while media initiatives have led to positive governance outcomes, including improved accountability, they have also at times had unexpected adverse effects."
The briefing provides background on governance field experiments, a potential tool in an evaluator's toolkit. To date, such experiments have concentrated on elections, community and local governance, and service delivery, leaving media assistance largely unaddressed. Of the nine field experiments reviewed, three were conducted in India, Mozambique, and Vietnam: "two studies showed the positive effects of media on informed voting, which is thought to lead to better governance and improved accountability. However, others had mixed or negative effects." Field experiments with radio listening groups were conducted in Rwanda, the Democratic Republic of Congo, and South Sudan to test whether programmes influenced behaviour change around peace and reconciliation; these, too, produced mixed findings.

"Divergent findings highlight the dangers of generalising from one type of programme or topic in one country to all types of programming and topics in all countries. The review underscores the need to articulate our theories of change clearly and test our underlying assumptions about media effects. If our assumptions about how media affects governance are incorrect, then so too will be media interventions to achieve governance outcomes."
The review highlights an opportunity for more experimental research, while noting that the complexity of media development may limit the efficacy of experimental evaluation. "To strengthen the evidence base, practitioners, researchers and donors need to agree which research questions can and should be answered using experimental research, and, in its absence, to agree what constitutes good evidence."
Three opportunities and challenges for the use of field experiments in the media assistance sector are outlined. Opportunities include the following:
- "In media-scarce environments. For example, limited broadcast range allows researchers to compare people with and without access to the media under investigation. Making use of the common practice of listening to the radio in groups can create similar conditions.
- To test assumptions about media effects. For example, testing unverified assumptions about how media affects democracy and governance can provoke greater reflection about programme goals and theories of change.
- To investigate influences on media. The limited studies in the review have largely investigated media effects; however, there is some research to suggest that institutions affect media quality and, ultimately, democratic development. As yet, field experiments have not been conducted on the factors that influence media content, practices and reach."
Challenges include the following:
- "Level of the intervention. Often media assistance programmes target national broadcasters, but these are more challenging to evaluate with field experiments. Interventions that involve a large number of 'units', such as individual journalists or media outlets, are more amenable to field experiments. Thus experimental evidence will tend to accumulate where a large number of units are available.
- Complexity of the intervention. Media assistance programmes designed to improve governance tend to combine different activities targeted at a diverse range of beneficiaries. This poses significant challenges when designing field experiments, such as difficulty ensuring that a control group is not exposed to any aspect of an intervention.
- Research planning under ambiguity. Programme objectives and activities typically evolve over time. While flexibility can lead to responsive development interventions, it can make it difficult to design an experiment."
Further, two issues may affect conclusions in the wider policy field: "Because it is easier to design field experiments with certain types of interventions, field experiments might fail to address particular questions, and even whole domains, that are of great interest to practitioners, donors and policy-makers."
"The accumulation of experimental evidence from certain types of cases and not others can lead to distorted conclusions about what works and what does not. The parameters of the cases must be taken into account when drawing policy lessons."
BBC Media Action website, June 10 2014.