Saturday, 4 January 2014

Evaluations identified for the first coding round

One thing that is special about our approach is that we do not only apply established quality standards to the evaluations we review; we also look into the effects these evaluations have produced. Whether or not an evaluation has to fulfil established quality standards to produce positive effects is an open research question. To answer it, we have to include in our review evaluations that vary in the degree to which they fulfil certain methodological standards. We hope that our research will shed light on the factors that contribute to negative and positive evaluation effects.

We initially cast a wide net, searching for any evaluations of work related to violence against women and girls. A first, cursory examination of the reports we netted showed that summaries tended to contain too little information on evaluation approaches and methods. We therefore decided to work with full evaluation reports only.

We found 140 such reports. In many reports on interventions that included VAWG as a secondary component (e.g. evaluations of multi-sector country programmes, reproductive health initiatives and humanitarian aid), VAWG-related work tended to occupy a marginal position. Analysing those reports could yield useful information on the quality and effects of evaluations in general – but our focus is on evaluations that are specifically designed for interventions on violence against women and girls.

In a further step, we narrowed down our set to reports completed in 2008-2012, excluding evaluations produced in 2013. This is because we will question evaluation stakeholders (through interviews and a web-based survey) about the effects their evaluations have produced. To make sure we can take into account effects that occur after an evaluation, we must allow for some time. One year seems a reasonable time-frame, even though we realise that some effects occur only at a later stage (for instance, the use of ‘lessons learned’ published in an article).

Of the 140 full evaluation reports, we have excluded 16 because they fell outside the 2008-2012 period, 43 because they evaluated interventions which included VAWG as a minor component only, 6 because both exclusion criteria applied, and one because it did not show a year of publication.
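For those who like to see the screening logic spelled out, here is a minimal sketch of our inclusion criteria expressed as a simple filter. The field names and sample records are hypothetical and purely illustrative – this is not the tool we actually used.

# Minimal sketch of the screening criteria (hypothetical field names, for illustration only)
reports = [
    {"id": "R001", "year": 2010, "vawg_primary": True},   # included
    {"id": "R002", "year": 2013, "vawg_primary": True},   # excluded: outside 2008-2012
    {"id": "R003", "year": 2011, "vawg_primary": False},  # excluded: VAWG only a minor component
    {"id": "R004", "year": None, "vawg_primary": True},   # excluded: no year of publication shown
    # ... 140 full evaluation reports in total
]

included = [
    r for r in reports
    if r["year"] is not None          # year of publication must be known
    and 2008 <= r["year"] <= 2012     # completed in 2008-2012
    and r["vawg_primary"]             # VAWG must be the primary focus of the intervention
]

# Applied to our 140 reports: 140 - 16 - 43 - 6 - 1 = 74 evaluations remain.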
The remaining set includes 74 evaluations of VAWG-related interventions in low- to middle-income countries. This is the full set of evaluations we have found to meet all our criteria – i.e. we do not draw any sample. The evaluations cover three different contexts (development, humanitarian and conflict / post-conflict) and the four strategic priorities that inform DFID’s work on violence against women and girls (see table below). The figures refer to the number of evaluations that match the criteria; the total exceeds 74 because some evaluations match several criteria.

DFID priorities                                        Development   Humanitarian   Post-/Conflict
Building political will and institutional capacity         22             7              12
Changing social norms                                       37             2               6
Empowering women and girls                                  12             3               3
Providing comprehensive services                            16             7               7
The evaluations deal with a broad spectrum of interventions of varying complexity carried out by public and not-for-profit actors (including women’s rights organisations), ranging from a single training project to multi-country programmes that bring together different types of interventions. Most of the evaluations we found were carried out near or after the end of an intervention; a smaller number are mid-term reviews.
The reports vary in length (8-258 pages); their median length is 52 pages (average length: 62 pages). The degree to which they fulfil established quality standards (with regard to the methodology employed, the protection of VAWG survivors and other aspects) will be assessed in the first coding round. What can be said at this point is that quality, understood in this way, appears to vary significantly. This is also true for the appearance of the reports.
All published reports we have identified will be shared with DFID. Nineteen of the 74 reports are unpublished or of uncertain publication status. We cannot share these with others, but we have obtained permission to extract data from them. It is important to keep them in the set of evaluations to be reviewed, as this is an opportunity to work on material that is not easily accessible to a wider public.
For those who would like to take a peek at the published evaluation reports: they can be retrieved via this link. The link takes you to a folder which also includes our full scoping report and a brief guide to its contents.
