How Can Peacemakers Show Success?
A Peacebuilding Monitoring and Evaluation Solutions Forum
Peacebuilding work matters, but we still struggle to show evidence of where interventions have led to positive outcomes, such as a clear reduction in violence or increased cooperation. The Peacebuilding M&E Solutions Forum is an opportunity for practitioners to connect with others working in this space and to share best practices, lessons learned, results, and evidence from across the broad spectrum of M&E activities in peacebuilding programming. The event was co-hosted by the U.S. Institute of Peace and the Alliance for Peacebuilding.
To lay the foundation for stronger evidence, we must overcome the challenge of weak M&E practices in rapidly shifting, complex environments. If we believe that our work matters and is effective, then we need to prove it to our beneficiaries, our donors, and ourselves. The Peacebuilding M&E Solutions Forum sought to begin addressing this need and to explore new ways to demonstrate the impact of peacebuilding programming.
Agenda
9:00am: Greetings & Welcome to the Event (AfP, USIP, & OEF)
9:10am: Morning Panel
- Improving Research Design (Methodological approaches for evaluations before, during, and after a program is implemented)
- Improving Data Analysis (Data-based Decision Making for Building and Sustaining Peace)
- Improving Data Use (Making Research Useful and Used: Lessons from a Mixed-Method Evaluation in Somalia)
- Moderated Q&A
11:15am: Breakout Sessions
- Impact Evaluations & RCTs
  - Randomized Control Trial Evaluation of Religious Peacebuilding
  - Seriously? RCTs' Role in Peacebuilding
  - Evaluating the Impact of Mass and Social Media Programming to Alter Public Perceptions
- Qualitative Research
  - Measuring the Hard to Measure: Qualitative Rigor
  - Capturing Conversations: Guided Dialogue Observation
  - Sensemaking in Peacebuilding Evaluation
- Joint Action & Collective Impact
  - Designing Collective Impact for Peacebuilding
  - Collective Monitoring: Lessons from the Central African Republic
  - Joint Community Action Plans for Peacebuilding in Nigeria
1:30pm: Breakout Sessions
- Adaptive Management & Evaluation
  - Adaptive Learning: Outcome Harvesting, an Alternative to Results-Based Programming?
  - Lessons from Experience: Outcome Mapping in Conflict-Affected Contexts
  - Developmental Evaluation: Youth and Community Resilience against VE
- Participatory M&E
  - Feedback Loops in Inclusive and Participatory M&E
  - Everyday Peace Indicators
  - PhotoVoice: Amplifying Youth Voices
- M&E Purpose & Quality Assurance
  - The Evolution of the Role of M&E in Peacebuilding Programming
  - Searching for Balance: From Agency Measurement to Program Specifics
  - Data Quality Assurance: Audio Check in Complex Environments
3:15pm: Breakout Sessions
- Evaluation & Data Use
  - Evaluation of an Early Childhood Development Program in Rwanda: Research with Children Under Age 6
  - Promoting and Evaluating Joint Action for Reconciliation Between Indigenous and Non-Indigenous Peoples in Saskatchewan, Canada
  - Monitoring for Adaptive Programming: Data to Inform
- Conflict Monitoring & Early Warning Systems
  - Conflict Scan as a Quick, Actionable Tool for Monitoring Conflict
  - The SCORE Methodology: Measuring the Intangible and Predicting Peace
  - A Community-Driven Approach to Monitoring Conflict and Peace Projects: A Review of the BEWERS Monitoring Strategy
- Theory & Institutional Learning
  - From M&E to Institutional Learning: Progress through University Partnerships (Case Study in Kurdistan, Iraq)
  - Theory-Based Approaches for Evaluative Program Monitoring
  - Evaluating Across Contexts: Lessons from a Meta-Review Process
Morning Panel
Keith B. Ives
CEO, Causal Design
Jon Kurtz
Director of Research and Learning, Mercy Corps
David Hammond
Director of Research, Institute for Economics and Peace
Andrew Blum
Executive Director, Kroc Institute for Peace and Justice
Beza Tesfaye
Senior Researcher, Mercy Corps