Evaluation

Last edited: December 20, 2011

In general, an effective programme evaluation will do much more than simply fulfill grant requirements. Evaluation should shed light on the process of implementing the programme as well as the impact that the programme had on participants and beneficiaries. Evaluation can:

  • Support programme improvements so as to enhance outcomes for women and girls
  • Document programme knowledge for dissemination and applicability in other contexts – how can others learn from programme experiences and apply models to local problems?
  • Support accountability – were resources expended in the manner planned and did the resources contribute to achieving programme goals? Why or why not?
  • Provide information on results for donors and other audiences

The United Nations Development Programme Handbook on Planning, Monitoring and Evaluating for Development Results includes a detailed description of general considerations when creating an evaluation methodology.

Evaluation draws on the data gathered during the monitoring process and will collect final data related to many of the same indicators. As described above, evaluation can compare “with and without” activity in different locations or “before and after” measures in the same location. It is critical to discuss evaluation planning with experts from other organizations, from universities, or with donor technical assistance groups in the planning stages of the programme so that evaluation can be seamlessly integrated into programme activities (OECD/World Bank, 2004).
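To make these comparison designs concrete, here is a minimal sketch in Python using invented figures: it contrasts a simple “before and after” change in one indicator at a programme location with a “with and without” comparison against a similar location where no activity took place. The site names, indicator and values are all hypothetical.

```python
# Minimal sketch with invented figures: comparing an indicator "before and after"
# the programme in one location, and "with and without" the programme across
# two locations. Nothing here reflects real data.

# Hypothetical indicator: reported cases per 10,000 population
programme_site = {"before": 12.0, "after": 19.5}   # location with the programme
comparison_site = {"before": 11.5, "after": 13.0}  # similar location without it

# "Before and after": change within the programme location
before_after_change = programme_site["after"] - programme_site["before"]

# "With and without": the programme location's change relative to the
# comparison location's change over the same period
relative_change = before_after_change - (
    comparison_site["after"] - comparison_site["before"]
)

print(f"Change in programme location: {before_after_change:+.1f} per 10,000")
print(f"Change relative to comparison location: {relative_change:+.1f} per 10,000")
```

As the guidance on mixed methods and the station feedback below suggest, a rise in reported cases may reflect improved reporting rather than increased violence, so such comparisons should always be read alongside qualitative findings.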

Data collection methods in evaluations are varied and can include many of the models discussed in the programme planning and design section.

See an overview of the pros and cons of each of these methods for gender equitable evaluation on the UN Women website.

Kenya – Evaluation of Trainings on Sexual Offences Act

The Kenya Women Judges Association (KWJA) conducted an evaluation of its local trainings for stakeholders, known as Court Users, on the Sexual Offences Act and the Children Act. The evaluation used a relatively simple, post-hoc model: training participants from six districts were brought together for a workshop to discuss how they felt the training had helped or not helped in their work on the ground. The evaluation workshop consisted of a survey administered to participants as well as group discussions about how the trainings had affected practice. Although the model was simple, the evaluation provided valuable information to KWJA about its work. The evaluation survey administered to training participants included the following questions:

Name:

  1. Please indicate whether you are a participant, an observer or other.
  2. If a participant, please state your occupation. For example, Hon. Magistrate, Lawyer, Medical Practitioner, Prosecutor, Investigator, Police Officer, Chief, Gender Officer, Probation Officer, Children Officer.
  3. Please state your station of operation, department and rank.
  4. How were you recruited to come to attend the training to be a participant/observer/other?
  5. Before attending this workshop, were you aware of the Sexual Offences Act (hereinafter referred to as the Act)?

- I was very much aware of it

- I had heard about it

- I had never heard about it

6. Had you read the Act before attending the Court Users Committee meetings?

- Yes

- Partially

- No

7. How has this knowledge enhanced your understanding of the Act?

8. How prevalent is Sexual Gender Based Violence in your area of operation?

9. How many cases/incidents were reported to you and what has been the outcome?

10. What challenges, if any, did you encounter in your area of operation and how has the training assisted in overcoming these challenges? Please explain.

11. How helpful has the training been in overcoming the challenges you faced?

- Extremely helpful
- Helpful
- Not quite helpful
- Not helpful at all

12. Please explain your reason(s) for the above answer.

13. To what extent has your knowledge in the subject of the Sexual Offences Act and Sexual Gender Based Violence improved and increased as a result of the training?

14. To what extent has the training helped to enhance your appreciation and understanding of your job as a whole on the Sexual Offences Act and Sexual Gender Based Violence?

15. How has the training enhanced your expertise and skills in handling Sexual Gender Based Violence cases?

16. Has the Court Users Committee enhanced the coordination of the stakeholders in dealing with Sexual Gender Based Violence cases?

17. Do you think the Court Users Committee is a good tool in handling Sexual Gender Based Violence cases?

- Yes

- No

18. Please explain your reason(s) for the above answer.

19. What were the positive outcomes, if any, in the application of the knowledge, expertise and skills acquired during the training on the Sexual Offences Act and Sexual Gender Based Violence? Please explain.

20. If you have handled a case, whether as a Hon. Magistrate, Lawyer, Medical Practitioner, Prosecutor, Investigator, Police Officer, Chief, Gender Officer, Probation Officer or Children Officer, will you share it with KWJA?

- If yes, forward to this address: 

- If no, please give reasons

21. Did you find the following factors adequate?

- Facilitators: a) Yes   b) No
- Venue: a) Yes   b) No
- Interactions: a) Yes   b) No
- The materials used: a) Yes   b) No
- The approach used during the training: a) Yes   b) No

22. If not, kindly give your reason(s) below.

23. Please give us your comments on how these programmes can be improved in future.

24. Before these trainings, had you ever heard of Kenya Women Judges Association?

- Yes
- No

25. Any additional comments.

The evaluation team then compiled data from these questions into charts. You can review the findings from the KWJA evaluation in the group’s Assessment Report. The feedback discussions also provided important insight into what participants valued about the trainings and how they were using the information. The feedback from discussions in each of the areas where trainings were held is summarized below:

Molo

  • There was coordination, networking and interaction with other stakeholders.
  • There was also better evidence gathering and preservation, better knowledge of SOA and SGBV cases.

Maralal

  • They have appreciated the challenges faced by other court users
  • Enhanced tactics for solving cases
  • Better skills in soliciting information from those affected and better counseling of spouses and children on their rights.
  • They respond better to these cases because they have the statutes.

Naivasha

  • They have now introduced Gender Desks and have trained officers to run them.
  • They handled sexual offences separately and promptly. Victims give evidence during Plea day.
  • Children are being put in different cells from adults.
  • Chiefs and administrators are more efficient.
  • Police are more informed. Evidence is better preserved.
  • They can counsel victims and take them to safe-houses.
  • There are better investigations.

Narok

  • More awareness from grassroots level and better skills
  • More cases being reported and less cover-up
  • More collaboration among stakeholders
  • More awareness of victims’ rights
  • Better skills in gathering and preserving evidence, for example DNA
  • More awareness of the severity of punishment and hence deterrence
  • Imposition of strict bail terms and improved information flow among stakeholders
  • Priority of trial in order to preserve evidence and discourage out of court settlement

Nakuru

  • Better placed to advise parents and victims about preservation of evidence.
  • Awareness of the channels to be followed and the legal requirements involved in handling those offences

Nyahururu

  • Conscious of the special needs of the victims
  • Further knowledge on how to handle SOA forensic evidence
  • Shared expertise and enhancement of the knowledge of law.
  • Networking with other stakeholders.
  • Identification of loopholes in cases.
  • Imparted knowledge on how to approach victims
  • Improved supervision of people/suspects on bond
  • Balanced media reporting so as not to prejudice cases before conclusion
  • Empowered commanders to train their juniors to better investigate and prosecute SGBV cases

 

Source: Interview with KWJA Staff, Nairobi, March 2011; Kenya Women Judges Association. 2010. Report on the Sexual Offences Act, Sexual Gender Based Violence and Children’s Act Training Assessment. December 2010.

 

When choosing evaluation methods, consider the following:

  • Choose appropriate and relevant methods: Choose data gathering tools based on their appropriateness for different kinds of initiatives. The most effective methodologies are those that are flexible and adaptable, simple to administer, designed to draw meaningful results, and are appropriate and relevant to the intended use and users of the evaluation.
  • Choose methods that are participatory: Participatory methodologies are those that allow all the defined users/stakeholders to submit data and information. Think about the intended respondents and their context when deciding which methods to use. For instance, while online surveys are economical and time-efficient, they are an inappropriate method if the intended respondents do not have regular access to the internet. Make sure that the tools used are accessible to the full range of respondents.
  • Ensure collection of disaggregated data: This is basic to any gender/human rights evaluation. All data gathered should identify the sex of the respondent and other basic data about the respondents that may prove relevant to the evaluation, including age, ethnicity, nationality, marital status and occupation (an illustrative tabulation sketch follows this list).
  • Understand the constraints and challenges of informants: Evaluations should be careful to draw out the experiences and input of female respondents/stakeholders and those of other marginalized populations. Ensure that the methods chosen do not impose any hidden barriers that make the participation of these groups more difficult. For example, the choice of location, the timing and the language used by the evaluator may all have a bearing on the capacity of particular respondents to participate. Some groups may not be able to express themselves freely because of social pressure, or they may not be allowed to speak or be represented in public meetings or community consultations. Women may have less time at their disposal because of their reproductive and domestic duties.
  • Interrogate gender roles: The instruments used should address the gender issues of the initiative or project, and must probe into broader gender issues. For example, in assessing the impact of an ICT training initiative, it is not only important to look into what the trainees have learned but also how they have applied their knowledge in their work or organization. In order to assess this, it is essential to probe into the gender roles within the trainees’ organizations and look at how they are able (or unable) to practice their newly-acquired skills.
  • Be context and culturally sensitive: Group dynamics, subject matter, gender, class, caste, age, race, language, culture, rural/urban issues, etc. greatly influence how effectively and inclusively information is gathered. Evaluations need to be undertaken in a culturally sensitive fashion in order for there to be a full understanding of human rights and gender equality implications. Cultures may be viewed as contextual environments for the implementation of human rights policies. Nevertheless, a human rights perspective affirms that the rights of women and girls to freedom from discrimination and to the highest standard of living are universal. Cultural claims cannot be invoked to justify their violation.
  • Emphasize mixed methods: Use multiple methods to help test, correct and correlate messages and data from different sources of information. In all cases, methodologies should focus on evaluating both the product and the process: what has been achieved so far, and the way it has been achieved as well as how the methods keep evolving. Information on those two aspects reveals much about the social processes at work in any society.
  • Go beyond numbers and statistics: Getting a complete picture of the social transformation and gender issues in a project or initiative requires more than numbers and statistics. Stories, perceptions, observations and opinions are valuable: they give the human dimension behind the statistics, a crucial part of understanding the collected data.
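As flagged in the point on disaggregated data above, the sketch below (Python with the pandas library, using invented field names and responses) illustrates the practical implication: when every survey record carries the respondent’s sex, age group and other attributes, any result can later be broken down by them.

```python
# Minimal sketch, assuming survey responses are stored with respondent
# attributes attached. All field names and records are invented for illustration.
import pandas as pd

responses = pd.DataFrame([
    {"sex": "F", "age_group": "18-35", "found_training_helpful": True},
    {"sex": "F", "age_group": "36+",   "found_training_helpful": False},
    {"sex": "M", "age_group": "18-35", "found_training_helpful": True},
    {"sex": "M", "age_group": "36+",   "found_training_helpful": True},
])

# Share of respondents who found the training helpful, disaggregated by sex
by_sex = responses.groupby("sex")["found_training_helpful"].mean()

# Further disaggregation by sex and age group
by_sex_and_age = responses.groupby(["sex", "age_group"])["found_training_helpful"].mean()

print(by_sex)
print(by_sex_and_age)
```

The same breakdown can be applied to any indicator; the practical point is that these attributes must be collected at the outset, since they cannot be reconstructed after the fact.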

Gender Equality and Human Rights Responsive Evaluation (UN Women, 2010). Available in English.  See also the UN Women online guide to gender equality and human rights responsive evaluation in English, French and Spanish.

 

USA – Evaluation Survey of Domestic Violence Program

A final report of an evaluation survey conducted in the US state of Colorado demonstrates how surveys, even of small numbers of people, can help provide valuable evaluation data on specific programme components. The Survey Analysis of the Domestic Violence Case Monitor Position revealed the following:

  • Representatives of all respondent groups reported that their practice has changed in a positive way as a result of the DVCM position.
  • All respondent groups rate the sustainability of the position as very important, with many using the word “essential.”
  • Judges, Probation Officers, and treatment providers all report positive change on multiple elements since implementation of the position (DAs were not asked this question; see DA section).
  • While the survey asked respondents to consider the position, not an individual, there was considerable support expressed for the person currently in the position from all respondent groups. “Misty Young must stay and grow old in this position.”
  • Judges are much more confident that offenders are being monitored appropriately.
  • Judges find it more typical to have timely and adequate information at revocation hearings than before the implementation of the position.
  • Sustainability of the position is important to the Probation Department in terms of enhanced communication and the provision of effective supervision by the DVCM, resulting in fewer cases ending up with Probation.
  • 100% of the treatment provider respondents report a positive impact of the DVCM. Two major areas were most frequently cited: Increased offender accountability; Improved communication between the courts and treatment providers.
  • 91% of the treatment providers report that the position has had a positive impact on their practice.

 

Source: Schwartz. 2006. Domestic Violence Case Monitor Position Survey Results.

 

Sample Evaluation of Domestic Violence Case Coordination Project (Maine, USA): Final Post-Survey

Goals:  To gather information and make recommendations about

A. The sharing of information regarding pending DV criminal and civil cases and orders and the sharing of information among community partners;

B. The coordinated management of related DV criminal and civil cases and orders;

C. Systematic review of offenders’ compliance with court orders and sentencing judgments; and

D. Whether these practices and protocols are improving victim safety and offender accountability.

Background

1. What is your role in this pilot project, and how long have you been involved in the work you are doing?

Coordinated Community Response

2. How do you interact and share information with other partners involved with domestic violence cases?

Information-sharing

3. Are you getting the information you need to make informed decisions or provide services that ensure victim safety and/or offender accountability?  If not, what additional information would you like to have?

Effectiveness of protocols

4A. Have the protocols in this pilot project (e.g. providing related DV case information, relationships developed in the advisory committee meetings, judicial review hearings presided over by the same judge, participation of probation and BIPs at JR hearings, etc.) made a difference in your ability to serve/respond to/make decisions regarding victims and offenders in DV cases?

4B. Have they made a difference in terms of victim safety and/or offender accountability?

4C. Can you provide specific examples of the positive impact of the protocols?

Impact of Training

5. Did you attend the January 20 training with the Vera Institute?  If so, did you implement or did you observe any changes in practices or protocols after the training? 

What do you believe or what have you observed to be the impact of those changes, if any?

Suggestions for Improvement

6. Is there room for (further) improvement in what your court is doing with its domestic violence docket?  If so, what kind of changes would you recommend?

Key Practices and Protocols

7. What do you believe are the most important protocols or practices for other Maine courts to consider in developing their own DV docket? (Refer to the “Draft Uniform Protocols” document as time allows, focusing on sections appropriate to the stakeholders. An alternative is to provide/e-mail the uniform protocols and ask them to e-mail comments.)

Unintended Consequences

8. Have there been any unintended consequences, positive or negative, of the domestic violence docket or of any of the protocols implemented as part of the pilot project?

See the full evaluation.

 

Additional tools on monitoring and evaluation:

A Compendium of Monitoring and Evaluation Indicators (MEASURE Evaluation, 2008). Available in English. 

Gender Justice Observatory (Women’s Link Worldwide).  Available in English and Spanish.

Researching Violence Against Women: A Practical Guide, (PATH/WHO 2005). Available in English.

Rule of Law Tools for Post-Conflict States - Monitoring Legal Systems (Office of the High Commissioner for Human Rights, 2006). Available in English.

The Comprehensive Assessment Protocol: A Systemwide Review of Adult and Juvenile Sex Offender Management Strategies. (Center for Sex Offender Management, 2011). Contains a series of questions to assess prosecutorial practices in sex offender cases. Available in English.

Women’s Initiatives for Gender Justice has released four Gender Report Cards evaluating the effectiveness of the ICC’s implementation of gender justice principles. Read the 2009 Gender Report Card in English.

 

Illustrative monitoring and evaluation reports in the justice sector:

Different Systems, Similar Outcomes? Tracking Attrition in Reported Rape Cases in 11 European Countries (Lovett and Kelly, 2009). Available in English.

Responding to sexual violence: Attrition in the New Zealand Criminal Justice System (Triggs, Mossman, Jordan and Kingi, 2009). Available in English.

Implementation of the Bulgarian Law on Protection against Domestic Violence (The Bulgarian Gender Research Foundation and the Advocates for Human Rights, 2008).  Available in English.

Judicial System Monitoring Programme (Women’s Justice Unit, Timor-Leste). Reports are available in English, Bahasa Indonesia and Portuguese.

Tracking Rape Case Attrition in Gauteng: The Police Investigation Stage (Sigsworth, Vetten, Jewkes and Christofides/The Centre for the Study of Violence and Reconciliation, 2009).  Available in English.

Tracking Justice: The Attrition of Rape Cases through the Criminal Justice System in Gauteng (Sigsworth, Vetten, Jewkes, Loots, Dunseith and Christofides/The Centre for the Study of Violence and Reconciliation, 2008). Available in English.