Source: globalpdx.org/wp-content/uploads/2018/03/deeper-dive...
Deeper Dive into M&E
Global PDX Seminar
March 9, 2018
Kay D. Mattson
Public Health/WASH
Development Consultant
Ashley Emerson
Program Director
Health in Harmony
Image source: http://andreiacosta.co.uk/blog/network-aesthetics/evaluation-collaborative-website/
Session Focus
• Follows up on the Global PDX Conference session
• Explore why organizations engage in (and should engage in) M&E
• Dive deeper into some of the nuts and bolts of M&E
• Get on the same page
• Explore some methods
• Linked to evaluating your project(s)
• Examples from an evaluator
• Examples from Health in Harmony
• Q&A
Global PDX Conference (Getting on the Road to Better M&E)
Source: Laura Koch, Vibrant Village
Session Leads' Background & Participant Experience
Participant Experience: "Survey Says"
• Kay Mattson (Independent Consultant)
• Ashley Emerson (Health in Harmony)
https://www.surveymonkey.com/results/SM-NRFR2KV68/
Session Participant
Survey Results
• Organizations
• M&E staff
• External vs. internal M&E
• Key focus areas of participants
• Comfort level with logical frameworks
• Experience in the "M&E" world
• Desires for this session
Why engage in M&E?
The basics
Photo courtesy of Zeynep Karakoca
Why is it important?
Monitoring
Routine tracking of services/activities and program/project performance, using input, process, and outcome information collected on a regular, ongoing basis (e.g., from policy guidelines, routine record-keeping, regular reporting and surveillance systems, observation, and surveys) to assess whether your project activities are on track.
Evaluation
Episodic assessment of results that can/may be attributed to program/project activities. It uses monitoring data and often indicators that are not collected through routine information systems. Evaluation allows exploration of the causes of failure to achieve expected results on schedule (a method to provide mid-course corrections if needed), and it informs future programming.
• Process evaluations
• Impact evaluations
M&E Cycle
[Diagram: the M&E cycle, with stages Needs Assessment, Problem Identification, Baselines/KAPs, Project, Monitoring, Evaluation, and Refine]
Monitoring: Things to consider
• Making a business case for it
• What are you trying to change/address in your project? What is your end goal?
• What is needed to get there, and how will you track that along the way?
• How will you know if you addressed it?
• What is realistic for your organization/budget/project?
• The Theory of Change (TOC) and logical framework serve as the foundation for the above
• Need to collect meaningful information on a regular basis to "measure results" (there are many ways to do this)
The Basics
• What will be monitored and why
• By whom
• How often
• Using which tools and methods
• Collecting the "right" data
Logical Framework Template (Tools4Development)
Columns: Project Summary | Indicators | Means of Verification | Risks/Assumptions
Rows: Goal, Outcome(s), Outputs, Activities
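To keep every logframe row carrying the same four columns, the template can be represented as a small data structure. This is an illustrative sketch, not part of the slides; all field names and example entries are hypothetical.

```python
# Illustrative sketch: the logframe template as a data structure so every
# row carries the same four columns. All names and entries are hypothetical.
LOGFRAME_COLUMNS = ["project_summary", "indicators",
                    "means_of_verification", "risks_assumptions"]

def logframe_row(summary, indicators, verification, risks):
    """Build one logframe row with the four standard columns."""
    return dict(zip(LOGFRAME_COLUMNS, [summary, indicators, verification, risks]))

# One hypothetical row per logframe level.
logframe = {
    "Goal": logframe_row("Improved community health",
                         ["% change in diarrhea incidence"],
                         ["Household survey"], ["Stable funding"]),
    "Outputs": logframe_row("Wells constructed",
                            ["# wells dug"],
                            ["Site inspection logs"], ["Drilling access"]),
}
```

Holding rows to a fixed column list makes gaps (e.g., a goal with no means of verification) easy to spot before data collection starts.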
Barreto (2010)
What does it all mean?
Input → Process → Output → Outcome → Impact
• Input: basic resources necessary to implement your project (policies, people, money, equipment, etc.)
• Process: project activities (training, logistics, delivery of goods, meetings)
• Output: results at the project/program level; measurement of program activities (# wells dug, # teachers trained, # stoves distributed, # people trained)
• Outcome: results at the level of the target population; changed behavior/different practices (people no longer using surface water, men now using condoms, stoves used in a separate structure rather than in the house)
• Impact: ultimate effect of the project in the long term (decreased incidence of diarrhea, reduced HIV rates, reduced morbidity/mortality, increased economic growth, increased high school graduation rates)
Intervening factors (observed and unobserved) and contemporaneous events can influence results all along this chain.
Five Types of Monitoring Data Everyone Should Collect (Goldilocks Toolkit): critical for learning and accountability
• Financial data: how resources are being allocated; useful for understanding the true cost of the project and for eventual measurement of cost-effectiveness
• Activity tracking: how the project is being implemented; data on key activities and outputs (tied to your Theory of Change and your logical framework)
• Targeting/population: information on the people in your program; demographic and other data that defines who you are serving and assists in determining whether or not you are reaching your target population
• Take-up and engagement: #/% of people doing X or using X in the project who were offered/targeted with the service*; uses targeting data to assess uptake and changed behavior/different practices (people no longer using surface water, men now using condoms, stoves used in a separate structure rather than in the house)
• Feedback data: gives you information about the strengths and weaknesses of the project; if uptake is low, more feedback may be needed to determine why, as well as how to improve/increase uptake (generally an exploratory process)
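The take-up figure described above (the share of people using the service among those offered it) is a simple ratio. A minimal sketch, with hypothetical numbers:

```python
def take_up_rate(num_using, num_offered):
    """Take-up: share of people offered the service who actually use it."""
    if num_offered == 0:
        raise ValueError("no one was offered the service")
    return num_using / num_offered

# Hypothetical numbers: 320 of 500 targeted households use the new stoves.
rate = take_up_rate(320, 500)  # 0.64
```

Tracking this ratio over time (rather than raw counts alone) is what flags a low-uptake problem early enough for the feedback step.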
Health in Harmony
[Diagram: program areas, Conservation, Livelihoods, Health, EDH, SDH, Entrep., Gender]
Evaluation
Evaluation: Things to consider
• Why are you doing it: usefulness*
• Expectations of your donors/funders
• Who is going to do the evaluation (importance of impartiality and independence)
• What's realistic for your organization/budget/project
• Methods/data analysis (more on this below)
• Sharing and making use of results
• Transparency
• Human subjects, confidentiality, avoiding bias
• Read other project evaluations
• Avoiding pitfalls: review TORs (terms of reference) if hiring externally
What’s realistic for your organization and project?
DAC Evaluation Criteria
DAC Criteria for Evaluating Development Assistance
• Relevance
• Effectiveness
• Efficiency
• Impact (and effects)
• Sustainability
• (Other criteria of interest to your organization, e.g., equity and scale-up)
http://www.oecd.org/dac/evaluation/daccriteriaforevaluatingdevelopmentassistance.htm
Strategic vision
• Phase 1: Proof of concept
• Phase 2: Incubate/test the model
• Phase 3: Scale impact/open source end game
Phase One: Proof of concept
Review of ASRI's 10-year impact
• 0-, 5-, and 10-year surveys (~1,400 households)
• GIS satellite data
• Ground truthing
[Chart: number of logging households, 2007/2012/2017. Today there are 1,200 fewer logging households than in 2007, an 89% decrease.]
[Chart: infant deaths per 100 households fell from 3.4 in 2007 to 1.1 in 2012. Mothers in child-bearing years surveyed: 1,291 (2007), 1,362 (2012).]
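As a quick consistency check on the logging figures above (arithmetic only, not from the slides): a drop of 1,200 households described as an 89% decrease implies a 2007 baseline of about 1,200 / 0.89 ≈ 1,348 logging households.

```python
# Sanity check (arithmetic only): "1,200 fewer logging households than 2007,
# an 89% decrease" implies the 2007 baseline below.
drop = 1200
pct_decrease = 0.89
baseline_2007 = drop / pct_decrease      # implied logging households in 2007 (~1,348)
remaining_2017 = baseline_2007 - drop    # implied households still logging (~148)
```

This kind of back-calculation is a cheap way to verify that headline percentages and absolute counts in a report agree with each other.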
ASRI's health 10-year impact
[Chart: number of households reporting each symptom, 2007/2012/2017. Prevalence of disease symptoms in the community: diarrhea, fever, cough >3 weeks, weight loss. Total sampling: 1,348 (2007), 1,497 (2012), 1,500 (2017).]
Health in Harmony
Methods
Qualitative
• Broad needs assessments
• Key Informant (KI) interviews
• Beneficiary/participant/client interviews
• Observational data
Quantitative
• Baseline/KAP surveys
• Questionnaires/surveys
• Technology-based tools
• Spatial data (GIS)
• Online survey tools
• Using enumerators
• Data management methods (databases, spreadsheets, logs, etc.)
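One low-tech data-management method from the list above is logging enumerator-collected records to a CSV file with a fixed header. A minimal Python sketch; the field names are hypothetical, not from the slides.

```python
import csv
import io

# Illustrative sketch: log enumerator-collected survey records to CSV
# with a consistent header. Field names are hypothetical.
FIELDS = ["household_id", "enumerator", "date", "response"]

def write_records(records, fileobj):
    """Write survey records to a CSV log with a fixed header row."""
    writer = csv.DictWriter(fileobj, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)

buf = io.StringIO()
write_records([{"household_id": "HH001", "enumerator": "A",
                "date": "2018-03-09", "response": "yes"}], buf)
```

Fixing the header up front means every enumerator's file can later be merged without reconciling column names.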
Selection of data collection methods
• What fits your budget/organizational capacities?
• How much time do you have to dedicate to the effort?
• Is it user-friendly?
• What fits your population/target area?
• Is it a good fit with your project design?
• What secondary data is available? (Consider timing and data-collection fatigue.)
https://www.surveymonkey.com/analyze/a5CGGnnZopVaT3muXsNTTyvohTaS8YmOW_2BdVqvD6CpM_3D&tab_clicked=1
Data, analysis and evaluation tips
• Think about all the data you plan to collect (for monitoring as well as for evaluation*)
• What do you intend to do with that data?
• What questions will it answer?
• What will you learn by collecting the data?
• What has been used in the sector/area of focus for your work (valid/tested)? No need to reinvent
• SDGs
• Credibility
• Validity
• Think about how you will analyze data from the beginning (who, what, when, and how). How deep do you want to go (descriptive vs. statistical significance)? This is tied to your budget/capacity
• Think about project evaluation from the beginning: what data do you need to have (include it in your logical framework)*
• What sample size will be needed?
• Plan ahead: if your project has a timeline (not open-ended), when are you planning to do the evaluation? (Don't wait until one month before the end of your project!)*
• Be realistic
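For the sample-size question above, a common starting point (not given in the slides) is Cochran's formula for estimating a proportion, n = z^2 * p * (1 - p) / e^2:

```python
import math

def sample_size(z=1.96, p=0.5, margin=0.05):
    """Cochran's formula for estimating a proportion: n = z^2 * p * (1 - p) / e^2.
    z: z-score for the confidence level (1.96 for 95%),
    p: expected proportion (0.5 is the most conservative choice),
    margin: acceptable margin of error (e)."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

n = sample_size()  # 385 respondents for 95% confidence, ±5% margin
```

This sketch ignores the finite-population correction, which would lower the required n for small target populations, and it assumes simple random sampling; cluster designs need larger samples.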
Example: School latrines
Participant involvement
Health in Harmony
Bukit Baka Bukit Raya National Park
- 181,000-hectare national park
- Logging in 14 villages
- Massive health concerns
- Strong community buy-in
- International Animal Rescue orangutan release site
- Radical Listening done
- Baseline survey completed
Radical Listening
Results, BBBR (14 villages):
1. Midwives
2. Sustainable agriculture training
3. Teachers/books
Manombo Nature Reserve, Madagascar
- ~5,000 hectares of last lowland forest
- Lack of access to health care
- Health/resource link
- Strong community and partner desire to work together
- Government support
Radical Listening
In Manombo (9 villages):
1. Increased agricultural productivity
2. Health care access
Data Collection
1. Baseline surveys
2. Measure interventions & on-the-ground monitoring
3. 3-year repeat survey
4. Satellite data
Phase 3 / Scaling impact: open source end game
Share results
Human Subjects, Bias and Confidentiality
Avoiding Bias
More than 50 types of bias have been identified in epidemiological studies (phew, where to begin)! Two broad categories:
• Information bias
  • Observation bias
  • Recall bias
• Selection bias
  • Undercoverage (sample is not representative of the overall target population)
  • Voluntary response bias (the people who respond to your survey are volunteers; they may have self-selected because they want to participate for a specific reason, while others do not)
  • Non-response bias (people unwilling or unable to participate)
Ethics and Human Rights
• UN Ethical Guidelines for Evaluation
• UN Code of Conduct for Evaluations in the UN System
• Human subjects and IRBs
• Confidentiality
• Informed consent
• Do no harm
Q&A
Resources
• Better Evaluation: http://www.betterevaluation.org/en (great resource; includes other logical frameworks and links to trainings)
• Design, Monitoring and Evaluation Guidebook, Mercy Corps: https://www.mercycorps.org/design-monitoring-evaluation-dme-guidebook
• Evergreen, Stephanie (2017). Effective Data Visualization: The Right Chart for the Right Data. SAGE
• Goldilocks Toolkit: https://www.poverty-action.org/goldilocks/toolkit
• IFRC Project/programme monitoring and evaluation (M&E) guide: http://www.ifrc.org/Global/Publications/monitoring/IFRC-ME-Guide-8-2011.pdf
• MEASURE Evaluation: https://www.measureevaluation.org/resources/tools (also a source for Multiple Indicator Cluster Surveys (MICS) and Demographic and Health Surveys (DHS))
• OPEN, Oregon Program Evaluator Network: https://oregoneval.org/about/
• Outcome harvesting: https://www.outcomemapping.ca/resource/outcome-harvesting
• Tools for Development: http://www.tools4dev.org/
• UNICEF M&E Quick Reference (ha ha, it's 14 pages!): https://www.unicef.org/evaluation/files/ME_PPP_Manual_2005_013006.pdf
Thank you!