
Multi-agency, Multi-country, Multi-donor (3M). Learning from a 3M Evaluation on What Works for Children and Youth in Peacebuilding


Author, Copyright Holder: Ella Duncan, DME for Peace

It seems just about every peacebuilding INGO and funder is talking the talk on collaboration and participatory methods these days, but what about walking the walk? A recent 3M evaluation (multi-agency, multi-country, multi-donor) in Colombia, Democratic Republic of Congo (DRC), and Nepal holds many lessons for how to walk the walk of collaboration and participation with children and youth in peacebuilding evaluation.

DME for Peace recently spoke with the two head evaluators of this evaluation, Claire O’Kane and Dr. Michael McGill. Read on to learn more about their methodologies, associated ethical approach, and their lessons learned about the best ways to engage children and youth in evaluation processes.

Background:

In July 2014, the Global Partnership for Children and Youth in Peacebuilding initiated a multi-agency, multi-country, multi-donor (3M) evaluation in Colombia, Democratic Republic of Congo (DRC) and Nepal to: 1) Map who is doing what and where to support child and youth peacebuilding (CYP); 2) Nurture durable partnerships increasing CYP quantity, quality, and impact; 3) With children and youth, assess the quality and impact of child and youth participation in peacebuilding and the variables influencing CYP impact; 4) Build the capacity of children and youth to meaningfully participate in CYP evaluations; and 5) Present key findings and recommendations to stakeholders to help increase the quantity, quality, and impact of CYP work.

The 3M evaluation was overseen by a Global Steering Team consisting of representatives from World Vision International, Save the Children, United Network of Young Peacebuilders (UNOY Peacebuilders), Search for Common Ground (SFCG) and Peace Action Training and Research Institute of Romania (PATRIR). Two Global Evaluators worked with the Global Steering Team to design the evaluation methodology and to encourage the formation of Country Partnerships. The evaluation methodology supported a participatory evaluation process involving children, youth, and adults as evaluators in Local Evaluation Teams (LETs). The evaluation was primarily qualitative. A multi-method approach was applied, including focus group discussions (FGDs) using participatory evaluation tools with different age groups, online mapping, interviews, drawing, stories, and analysis of available secondary data. In particular, visual participatory evaluation tools such as a Timeline and a before-and-after Body Map were applied.

How did you choose your methodologies for this evaluation?

Claire: Our number one concern was having a framework and process to engage children and youth as very active evaluators. This intent informed the choice of methods: we chose methods that could be easily explained, and that children and youth could explain to their peers. This led us to adapt and use participatory evaluation methods, particularly visual participatory tools.

Michael: We also had to consider our limitations. Our evaluation included children and youth aged 10 years and older; we would have liked to look at younger age ranges as well, but had to make choices to narrow down the tools. However, we believe that younger children can also contribute in meaningful ways to peacebuilding, but that would have required different methodologies. The design of each tool was shaped by what was developmentally appropriate for the age and evolving capacity of the children we were engaging with.

Claire: We built both flexibility and consistency into the tools. Flexibility came from bringing our tools to capacity-building workshops at the outset and using that opportunity to test them and make sure they were relevant. Consistency came from applying the tools in the same way within and between contexts.

It was difficult to find indicators for the impact of child and youth participation in peacebuilding that could be easily applied across different contexts. However, we were able to identify and develop indicators for evaluating the quality of child and youth participation in peacebuilding. The quality indicators drew on the UN Committee on the Rights of the Child's 2009 General Comment on the right of the child to be heard and on the Guiding Principles on Young People's Participation in Peacebuilding. Thus, indicators for quality of process were easier to identify than indicators for outcomes, but there were common themes that could be replicated for other projects, or even rolled into baselines for projects that build on this work.

Michael: To add some insight into our process for creating common indicators: we worked from both sides, isolating variables and indicators ahead of time, and then reviewing after the fact which emerging themes could be seen as indicators within and between contexts.

What were your lessons on the best ways to engage children and youth in evaluation processes?

Michael: Undertaking a participatory evaluation process was not the easiest way to conduct this evaluation, but it was certainly a very good way to do so. Over 120 evaluators were involved in the process. It took a lot of energy, time, money, and communication to manage all those evaluators in ways that built on and increased their capacity as evaluators. It was worth it because it allowed such richness and depth in the analysis, but it's important to remember that the benefits were costly.

A good way to think about the resources needed to add activities to something of this scale is to imagine a pyramid. If someone says, "Let's add just one more block to the base," it seems like a small ask, but that addition then has to be continued all around the base and built up to the top of the pyramid. Similarly, at the beginning of an evaluation, when you say you are adding just "one more question," you are adding translation time, contextualizing time, training time on how to ask the question, time to ask all participants, data collection, analysis... All those seemingly small components create a huge amount of work that ultimately creates tension with the rest of the work you are doing.

Claire: I definitely agree. You always needed more time. Time is crucial to get meaningful agency and partnership buy-in. Time for countries to identify and elect young people with the interest, motivation, and capacity to join a peacebuilding evaluation process. Time to train children and youth participants. Enough time for children and youth participants to really engage when they are available, on weekends and during school holidays. And still more time for analysis to go deeper into the findings.

Claire: It was mentioned before, but it bears repeating: using highly participatory visual methods was key. We also learned an important lesson about using the simplest language possible, and about putting the proper resources and checks in place to make sure tools are understood across contexts.

What does the future hold for this project?

Michael: A component of the project was an interactive map (www.GPCYP.com/map) that compiles (present tense!) the shared knowledge of the wider peacebuilding community on the holistic patterns of conflict risk and the locations of peacebuilding actors working to address those risk factors. It shows data from a range of participating initiatives. The map captures quantifiable data on children and youth peacebuilding inputs (age range, number of participants, types of activities), and with enough input it could hopefully reach a scale that allows for greater analysis of the correlation between inputs and impact.

We encourage any agency that has conducted or is conducting child or youth peacebuilding activities to log in to the mapping platform and add their work. As more agencies do so, this map will become a fantastic tool for analysis and for helping others find child and youth peacebuilding efforts to support.

This evaluation has been a wonderful and powerful achievement, but we would certainly like to continue the analysis. Now that people and communities have been trained, data collection could be more easily continued and even expanded. The Global Partnership for Children and Youth in Peacebuilding is interested in welcoming new partners.

Learn More

Read the 3M Evaluation.

Watch a recording from February 3, 2015, when the Washington Network on Children and Armed Conflict (WNCAC) hosted a panel discussion on Children and Youth in Peacebuilding: Evidence-based approach to breaking cycles of violence, featuring the 3M evaluation team.

Read about the 3M evaluation participatory process in Nepal.

Claire O’Kane

Claire O’Kane is an international child rights consultant who is passionate about children and young people’s participation. She is a qualified social worker with more than 20 years of international experience in child rights, participation, citizenship, protection, care, and peacebuilding work in development and emergency contexts, especially in the Asia and Africa regions. Claire was one of the lead evaluators for the 3M process.

Dr. Michael McGill

Dr. Michael McGill holds a BA in Communications, an MA in counseling/psychology, and a PhD in intercultural studies. His research focuses on child and youth participation in peace processes. He has worked as a child psychologist, invested time in 50 countries, and devoted over 15 years to developing international partnerships and collective impact initiatives. He currently serves as the founding Director of Young Peacebuilders (www.YoungPeacebuilders.com) and was one of the lead evaluators for the 3M process.