Design, Monitoring and Evaluation in Humanitarian Interventions Across the Globe (.pptx)

ehsanullahdawary · 22 views · 34 slides · May 28, 2024

About This Presentation

Professional methods for monitoring and evaluation, intended to help monitoring and evaluation professionals improve their practice.


Slide Content

Slide 1: Welcome to Design, Monitoring & Evaluation Orientation and Training (January 26, 2008)

Slide 2: DM&E: Training Objectives
- Learn principles of Design, Monitoring and Evaluation
- Learn basic terminology related to DM&E
- Understand standard DM&E tools: log frame, indicator plan, work plan, causal and logical diagrams
- Recognize different types of indicators and their purpose
- Relate these principles and tools to individual programs
- Improve capacity! (pre- and post-test)

Slide 3: The Design, Monitoring & Evaluation Cycle Is Common in Our Field
Program life cycle elements: Needs Assessment, Logical Articulation of Design, Ongoing Program, Ongoing Regular Monitoring, Conduct Appropriate Evaluation, Project Wrap-Up, Learning and Sharing, Lessons Learned.
We need to apply this cycle and its elements not just to programs, but to our work in general: keep learning from our experiences.

Slide 4: Elements of Program Design

Slide 5: Mercy Corps Uses 3 Primary Tools to Express Program Design
- Logical Framework: a summary of project logic and indicators
- Work Plan: a schedule of project activities, targets, and resource allocation
- Indicator Plan: a plan detailing what we measure and how
These items are usually part of the proposal process and are used to manage a program.

Slide 6: We Also Use 2 More Tools to Express Program Design
- Causal Diagram: a sketch of the main assumed project relationships
- Logical Diagram: a structured or hierarchical representation of project causal relationships
These are relatively new to Mercy Corps, but they are an important part of the process, and we hope to incorporate them into all new programs.

Slide 7: Program Design: Create a Goal
A Goal is a simple, clear statement of the impact we want to achieve with our project. It is the change we hope to bring about in the community. It should answer the question: What are our target communities like if we are successful? What are your programs' goals?

Slide 8: Program Design: Create a Goal (Entering the Goal into the Log Frame)
The Goal goes at the top of our log frame, but it may be changed as we design our program.

Slide 9: Program Design: Identifying Objectives
Objectives are the major changes in a population's knowledge, attitudes, or behavior that will make our goal a reality. Where the goal is the vision of the future, our objectives are the effects that create the change we imagine. Objectives should:
- logically and reliably contribute to our goal
- completely represent the scope and strategy of our work

Slide 10: Program Design: SMART Objectives
What makes an Objective SMART? SMART Objectives reflect the project's need for results and help identify exactly how the project will reach its goal. SMART Objectives also help in creating reliable indicators for good monitoring and evaluation.
SMART means:
- Specific
- Measurable
- Achievable
- Relevant
- Time-bound (a clear beginning and end)

Slide 11: Program Design: SMART Objectives
Realistically, objectives aren't always SMART, but it's good to keep the SMART criteria in mind. If objectives are well articulated, they already contain their indicators. Note also that EC log frames use different terminology: Overall Purpose (the big goal), Specific Purpose (something between an objective and a goal), and Expected Results. In that case, think of Expected Results as our SMART Objectives.

Slide 12: Program Design: Relationship Between Objectives and Goals
The Objective (intermediary result) is an EFFECT on CAPACITY; the Goal (result) is an IMPACT on WELL-BEING.
- Improved access to education → Increased literacy
- Improved access to water → Reduced incidence of diarrhea
- Improved access to food → Improved nutritional status
- Changed maternal/child health behavior → Reduced mother and infant mortality
- Increased productivity of land → Increased farmer income
- Improved access to credit → Increased income from business
- Increased capacity to solve problems → Improved local governance
- Improved capacity to deliver justice → Improved rule of law

Slide 13: Program Design: Outputs
Project outputs are the products, sometimes called "goods and services", that must exist in order to achieve our objective. They are usually the result of activities; think of them as "deliverables". A critical assumption exists between the things Mercy Corps can provide (Outputs) and the changes we hope to see in our target communities (Objectives). The examples below highlight these assumptions:
- Output: Trainings conducted → Objective: Participants change behavior
- Output: Schools rebuilt → Objective: Children resume classes
- Output: School biscuits provided → Objective: More children attend school

Slide 14: Program Design: Formulate Outputs (Identify Outputs for the Log Frame)

Slide 15: Program Design: Formulate Outputs (Example of Log Frame with Outputs)

Slide 16: Program Design: Major Activities
Activities are the daily efforts that result in the desired outputs and, through them, the objectives and goal. Focus on major activities for the log frame (a management need vs. log frame content). CAUTION: don't let the activities lead (determine) the log frame process.
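The goal → objectives → outputs → activities hierarchy built up over the preceding slides can be sketched as a simple data structure. This is a hedged illustration only: the field names and sample entries are invented for this sketch, not an official Mercy Corps log frame format.

```python
# Sketch of the log frame hierarchy: Goal -> Objectives -> Outputs -> Activities.
# The structure and example entries are illustrative assumptions only.

log_frame = {
    "goal": "Improved livelihoods among target farming communities",
    "objectives": [
        {
            "statement": "Increased livestock production among target farmers",
            "outputs": [
                {
                    "deliverable": "Trainings on production improvements conducted",
                    "activities": [
                        "Recruit trainers",
                        "Schedule and run training sessions",
                    ],
                },
            ],
        },
    ],
}

# Walking the hierarchy top-down mirrors the design process above:
# the goal leads, and activities come last, not the other way around.
for obj in log_frame["objectives"]:
    for out in obj["outputs"]:
        for act in out["activities"]:
            print(f"{obj['statement']} <- {out['deliverable']} <- {act}")
```

Reading the structure from the goal downward reflects the caution on this slide: activities sit at the bottom of the hierarchy and should be derived from it, not drive it.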

Slide 17: Log Frame Puzzle
Instructions:
- Teams will receive pieces of the Log Frame Puzzle
- The goal is to fit them together into a completed log frame
- 20 minutes to complete the activity
Hints:
- Think carefully about both the logic of the log frame and the characteristics of outputs
- You may want to draw a chart to house the puzzle pieces

Slide 18: Log Frame Puzzle "Answer" Key

Slide 19: Design Process: Select Indicators
Indicators are measures that allow us to verify our program progress. Indicators are often confused with targets (also called benchmarks or milestones by various organizations). What's the difference?
- Indicators tell us what we are going to measure
- Targets quantify those measures and set goals or benchmarks for them: how much or how many, by what time
Indicators can also be confused with SMART objectives, which often tell you what the indicator is.
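The indicator-versus-target distinction above can also be shown in code. This is a minimal sketch; the class names, field names, and sample figures are assumptions invented for illustration, not taken from the slides.

```python
# Minimal sketch of the indicator-vs-target distinction described above.
# All names and values are illustrative assumptions, not real program data.

from dataclasses import dataclass


@dataclass
class Indicator:
    name: str   # WHAT we measure
    unit: str


@dataclass
class Target:
    indicator: Indicator
    value: float    # HOW MUCH or HOW MANY...
    deadline: str   # ...by WHAT TIME


def progress(measured: float, target: Target) -> float:
    """Share of the target achieved so far (0.0 to 1.0 and beyond)."""
    return measured / target.value


farmers_trained = Indicator("number of farmers receiving training", "farmers")
t = Target(farmers_trained, value=500, deadline="2009-12")
print(progress(measured=350, target=t))  # 0.7
```

The indicator alone says nothing about success; only once a target attaches a quantity and a deadline can monitoring report progress against it.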

Slide 20: Design Process: Types of Indicators
- Process indicators: measure the process or procedure of our work (the output indicators)
- Results indicators: measure the change occurring as a result of our activities, moving us toward the achievement of our objective
  - Effects: immediate changes in behavior, skills, and knowledge
  - Impact: what those changes mean

Slide 21: Program Design: Indicators in the Log Frame (How do we account for different types?)

Slide 22: Program Design Process: Indicators (Baseline Data Collection)
What is a baseline?
- A measure of our indicators at the beginning of the program or project
- It should apply to the indicators we have identified to measure our program
- It is necessary to show improvement over time; it is very difficult to show results in an evaluation without baseline data
How is baseline data different from an assessment? An assessment is generalized to help identify needs; it is not based on specific indicators. What baselines have you participated in?
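Why results are hard to show without a baseline comes down to simple arithmetic: a change can only be reported relative to a starting measurement. A hedged sketch follows; the indicator and the figures are invented for illustration.

```python
# Sketch: an endline figure is only meaningful relative to a baseline.
# The indicator and numbers below are invented for illustration.

def percent_change(baseline: float, endline: float) -> float:
    """Relative change of an indicator over the project period."""
    if baseline == 0:
        raise ValueError("cannot compute relative change from a zero baseline")
    return (endline - baseline) / baseline * 100

# e.g. a hypothetical "% of mothers obtaining improved prenatal care"
baseline_value = 40.0   # measured at project start
endline_value = 58.0    # measured at the final evaluation
print(f"{percent_change(baseline_value, endline_value):.1f}% change")  # 45.0% change
```

Without the baseline measurement, the endline value of 58.0 on its own cannot tell an evaluator whether the project improved anything.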

Slide 23: Program Design: Review
Identify each of the following as a goal, objective, activity, process indicator (output), or result indicator (effect or impact):
- "Afghan farmers with improved livelihoods"
- "Provide training on production improvements to farmers"
- "Number of farmers receiving training"
- "Increased livestock production among Helmand farmers by December 2009"
- "% of mothers obtain improved prenatal care"
- "% increase in production of wheat"
- "Increase in sales among wheat producers in Helmand"
- "Rebuild irrigation canal for community x"
- "Improved test scores among agricultural students"

Slide 24: Discussion of Program Monitoring

Slide 25: Program Monitoring
What is monitoring? Routine data collection, reflection, and reporting about project implementation that occurs on a regular basis. It is generally used to check our performance against expected results or "targets", as well as to ensure compliance with donor regulations.
What information should we actually monitor?
- Indicators for the objectives (process and results)
- Activities and outputs against targets set forth in the Work Plan
- Quality/technical standards
- What else?

Slide 26: Program Monitoring
How do these items get monitored, and by whom?
- Indicators for the objectives (process and results): the Indicator Plan, designed by the PM with DM&E help
- Activities set forth in the Work Plan: program management, tracked by the program team through regular meetings
- Quality/technical standards: we suggest developing monitoring checklists or forms for this purpose; we can help, but program staff have to design and perform them
- Financial and compliance: documentation and program files

Slide 27: Program Monitoring: Using an Indicator Plan
The Indicator Plan describes the way we will measure our indicators and should be the first step in developing a monitoring system. It is especially useful for complex indicators, like "capacity building".

Slide 28: Program Monitoring: Frequent Monitoring Challenges
Several challenges in regular monitoring:
- Management challenges in finding the time and resources to conduct good monitoring
- Lack of clarity about what to monitor, how, and who is responsible
- Mastering data collection techniques
- Ensuring data and information get to the people who need them for decision-making
- Multiple reporting formats
- Others?

Slide 29: Discussion of Program Evaluation

Slide 30: Evaluation Introduction
What is an evaluation? A periodic review of a project (or program) intended to measure project outcomes and/or impact. This process serves the twin aims of 1) telling us how we did and 2) teaching us how to make programs better.
Why do we conduct evaluations?
- Strategic learning (this should be the primary motivation!)
- Management request
- Donor request

Slide 31: Program Evaluation: Understanding the Evaluation
When do we evaluate? Mid-term, final, and ex post.

Slide 32: Program Evaluation: Understanding the "Science" of Evaluation
What are some evaluation tools?
- Surveys
- Focus groups
- Review of baseline and monitoring data (surveys and capacity indices)
- Observations
- Participatory evaluation
- Maps
- Case studies
- Document analysis
- Structured interviews
- Semi-structured interviews
- Field visits and tours (e.g., a transect walk)

Slide 33: DM&E: Review Objectives
- Learn principles of Design, Monitoring and Evaluation
- Learn basic terminology related to DM&E
- Understand standard DM&E tools: log frame, indicator plan, work plan, causal and logical diagrams
- Recognize different types of indicators and their purpose
- Relate these principles and tools to individual programs
- Improve capacity! (pre- and post-test)

Slide 34: Thank You for Your Time!
Thanks so much for your participation and attention!