Donaldson & Christie 2019 FOUNDATIONS Workshop 8-14-19 (2).ppt
About This Presentation
This presentation covers the foundations of research for monitoring and evaluation experts who are new to the evaluation field or want a refresher.
Size: 4.85 MB
Language: en
Added: Oct 08, 2024
Slides: 96 pages
Slide Content
1
Foundations of Evaluation and
Applied Research Methods
Stewart I. Donaldson
Christina A. Christie
CGU Professional
Development Workshops
Claremont, CA
August 14, 2019
2
Introduction Exercise
Small Group Discussion
3
LaVelle, 2012
5
Overview
Evaluation
Program Evaluation
Applied Research
Applied Research Methods
6
Patton, 2008
7
Evaluation is so much MORE than the
Application of Research Methods
A Positive Social Epidemic?
You Can Run But You Can’t
Hide
Globalization of Evaluation
Growth of Evaluation Professional
Associations
1980s – Only 3 National and Regional
Evaluation Societies
1990 – 5
2000 – More than 50
2015 – Approximately 225 including a
Formal International Cooperation
Network
Examples of Professional
Evaluation Organizations
American Evaluation Association
Canadian Evaluation Society
European Evaluation Society
Australasian Evaluation Society
African Evaluation Association (AfrEA)
IDEAS
International Organization for Cooperation in
Evaluation (IOCE)
EvalPartners
Exemplary Evaluations
15
16
Evaluation as a Transdiscipline
A discipline with a clear
definition, subject matter,
logical structure, and
multiple fields of application.
Applications of Evaluation
• Education
• Literacy
• Poverty
• Prejudice & Discrimination
• Conflict
26
Evidence-Based Practice
Highly Valued
Global
Multidisciplinary
Many Applications
27
In God We Trust
ALL OTHERS MUST HAVE CREDIBLE
EVIDENCE
How is this Epidemic Spreading?
Technological Innovations that
Improve Evaluation Practice
Globally Connected, Technology-Facilitated, Fast-Paced, Real-Time World
(Patton, 2011)
Claremont Graduate University Summer Workshops, 2009
Webinar with Guests in 5 Different
Geographic Locations
Free E-Learning Certificate
Program
Certificate of the Advanced Study
of Evaluation (distance education)
34
New Online M.S. Degree in
Evaluation & Applied Research
35
36
Core Knowledge Base of
Professional Evaluation
Evaluation Theory
Evaluation Design
Evaluation Methods
Evaluation Practice
The Evaluation Profession
Research on Evaluation
AEA Evaluator Competencies Task
Force 2015
Evaluator Competencies
Evaluator Competencies
Evaluator Competencies
Credentialing
45
THE FEAR FACTOR IN EVALUATION
[Slide graphic: stakeholder worries aimed at the EVALUATOR]
Will I be let go?
How much time will they take?
Will they listen to me?
Will they understand our program?
What if we get ranked low?
Excessive Evaluation Anxiety (XEA)
Anxiety because of the unknown potential
harms of an evaluation.
Fear of the prospect of a negative evaluation.
49
Evaluation Anxiety
Consequences of XEA
Signs of XEA
Sources of XEA
Strategies for Managing XEA
Psychology of Evaluation
TOWARD A POSITIVE
PSYCHOLOGY OF EVALUATION
Positive Psychology of Evaluation: How to Engage
Stakeholders in a Positive and Productive Process of:
52
Why Evaluate?
…Purposes of Evaluation
Program and organizational improvement
Oversight and compliance
Assessment of merit and worth
Knowledge development
53
Reasons to Evaluate
Determine the need for a program (needs
assessment)
Assist in program planning by identifying potential
program models to achieve goals (needs
assessment/program planning)
Describe program implementation
(monitoring/process)
Determine if goals have been achieved (outcome)
Judge overall benefit of program (relative value and
cost)
54
Evaluation and Applied Research Methods: The Tools of the Trade
Evaluation & Research Design
Data Collection
Data Analysis and Interpretation
Logic Models
Program Theories
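The slides list logic models among the tools but do not include a worked example. As a minimal sketch, assuming a hypothetical job-training program (the stages and entries below are invented for illustration, not taken from the workshop), a logic model can be written down as a simple inputs-to-outcomes chain:

```python
# A minimal logic model sketch for a hypothetical job-training program.
# Stages follow the conventional inputs -> activities -> outputs -> outcomes chain.
logic_model = {
    "inputs": ["funding", "trainers", "curriculum"],
    "activities": ["weekly job-search workshops", "one-on-one coaching"],
    "outputs": ["sessions delivered", "participants served"],
    "short_term_outcomes": ["improved job-search skills", "greater self-efficacy"],
    "long_term_outcomes": ["reemployment", "improved well-being"],
}

for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```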
55
Evaluation Questions
Which tools we use, and how and when we use these “tools of the trade,” is determined by the key Evaluation Questions
56
Evaluation Questions
Formative: Information for program improvement
Summative: Decisions about program adoption, continuation, or
expansion
Needs Assessment: Determining whether (and which) problems/needs exist
Process or Monitoring: Describe how a program is being
delivered
Outcome: Describing, exploring and determining what occurred
in program recipients or communities
57
Evaluation Purpose:
Formative & Summative vs. Process & Outcome
Formative & Summative
The intention of the evaluator in undertaking the study: to provide information for program development or to render a judgment about it
Process & Outcome
The phase of the
program being studied
58
Evaluation Purpose: Formative and
Summative Evaluation
FORMATIVE
Designed to assist in
program development
SUMMATIVE
Designed to assist in
decisions about whether
to continue or end a
program, extend it or
cut it back – Go/No-go
59
Evaluation Purpose: Outcome and
Process Evaluation
OUTCOME
Measures effects,
results, impact on
participants
These can be intended
or unintended
Asks the questions:
What was the impact?
and Compared to what?
Both intermediate and long-term
PROCESS
What is going on?
What does the program
do? Are folks doing
what they said they
would?
What produced the
outcomes and Why?
How did/does it work?
How was it
implemented?
60
Evaluation & Research Design
Experimental (Gold Standard?)
Quasi-Experimental
Non-Experimental
61
Data Collection Methods
Quantitative
Observations that lend themselves to numeric
representations (numbers)
Qualitative
Observations that do NOT lend themselves to
numeric representations (words)
Mixed-methods
62
Data Collection Methods
Surveys
Interviews
Focus Groups
Observations
Tests (Assessments)
Document Review
63
Data Analysis and Interpretation
Provides Meaning to the Data
Qualitative: Coding and Themes
Quantitative: Descriptive and Predictive
Standards of Judgment
Communication and Reporting of Results
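As a hedged illustration of the analysis steps named on this slide, the sketch below computes descriptive statistics for hypothetical quantitative scores and tallies hypothetical qualitative codes into themes; the data, scale, and code labels are assumptions made for this example only.

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical quantitative data: participant satisfaction ratings on a 1-5 scale.
scores = [4, 5, 3, 4, 2, 5, 4, 3]
print(f"Descriptive summary: n={len(scores)}, mean={mean(scores):.2f}, sd={stdev(scores):.2f}")

# Hypothetical qualitative data: codes assigned to interview excerpts.
codes = [
    "staff_support", "access_barriers", "staff_support",
    "program_length", "access_barriers", "staff_support",
]
for theme, count in Counter(codes).most_common():
    print(f"Theme '{theme}': {count} excerpts")
```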
64
Small Group Exercise
Case Scenario
Determine Appropriate Methods
Rationale for Method Choice
65
What Counts as Credible
Evidence?
66
Experimental Design:
Gold Standard?
Random Assignment
Experimental Control
Ruling Out Threats to Validity
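To make “random assignment” concrete, here is a minimal sketch of assigning a hypothetical participant list at random to treatment and control groups; the IDs, group sizes, and seed are assumptions for illustration, not part of the workshop materials.

```python
import random

# Hypothetical participant IDs; in practice these would come from program records.
participants = [f"P{i:03d}" for i in range(1, 21)]

random.seed(42)              # fixed seed so the assignment can be audited and reproduced
random.shuffle(participants)

midpoint = len(participants) // 2
treatment = participants[:midpoint]  # assigned to receive the program
control = participants[midpoint:]    # comparison group, does not receive the program

print("Treatment:", treatment)
print("Control:  ", control)
```

Comparing outcomes across the two groups then estimates the program's impact, since random assignment makes the groups equivalent in expectation before the program begins.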
67
The Solution: In Search of a Time
Machine
68
Supreme Courts of Credible
Evidence
What Works Clearinghouse
Campbell Collaboration
Cochrane Collaboration
69
Small Group Exercise
Should RCTs be the Gold Standard for Impact
Evaluation Design?
Pro
Con
70
Challenges of the Gold Standard
AEA Statement vs. Not AEA Statement
Theoretical
Practical
Methodological
Ethical
Ideological
Political
So What Counts as Credible
Evidence (Donaldson, 2008)?
It depends:
Question(s) of Interest
The Context
Assumptions of Evaluators &
Stakeholders
Theory of Practice
Practical, Time, & Resource
Constraints
CDC: Gathering Credible
Evidence
Definition: Compiling information that
stakeholders perceive as trustworthy and
relevant for answering their questions.
Such evidence can be experimental or
observational, qualitative or quantitative,
or it can include a mixture of methods.
Adequate data might be available and
easily accessed, or it might need to be
defined and new data collected.
From Experimenting Society to
Evidence-based Global Society?
From “RCTs” as the Gold
Standard to “Methodological
Appropriateness” as the Platinum
Standard
Donaldson, 2009
74
Roles for Theory in Evaluation
Practice
Evaluation Theory
Program Theory
Social Science Theory
76
Example: Winning New Jobs
Program Theory
[Path diagram: the WNJ Program is posited to increase Job Search Self-Efficacy, Job Search Skills, and Inoculation Against Setbacks, which in turn lead to Reemployment and improved Mental Health.]
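The path diagram above can also be recorded as a small directed graph, which is sometimes useful for spelling out which mediating links an evaluation will test. This is a sketch reconstructed from the slide layout; the assumption that every mediator feeds both outcomes is mine, not a claim from the workshop.

```python
# The WNJ program impact theory sketched as a directed graph:
# program -> proximal mediators -> distal outcomes.
program_theory = {
    "WNJ Program": ["Job Search Self-Efficacy", "Job Search Skills",
                    "Inoculation Against Setbacks"],
    "Job Search Self-Efficacy": ["Reemployment", "Mental Health"],
    "Job Search Skills": ["Reemployment", "Mental Health"],
    "Inoculation Against Setbacks": ["Reemployment", "Mental Health"],
}

# Enumerate each program -> mediator -> outcome chain that could be tested.
for mediator in program_theory["WNJ Program"]:
    for outcome in program_theory.get(mediator, []):
        print(f"WNJ Program -> {mediator} -> {outcome}")
```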
77
Program Theory-Driven Evaluation
Science: 3 Key Steps
Develop Program Impact Theory
Develop and Prioritize Evaluation Questions
Answer Evaluation Questions
80
Evaluation Theory Exercise
Small Groups of 5-10
Evaluate the Room from the Perspective
Presented on the Handout
81
Evaluation Theory
Prescriptive (not empirically based)
Guide Practice
e.g., Design, Methods, Breadth and Depth
of Stakeholder Involvement
Driven by the Primary Role of Evaluation
82
83
84
Methods for Vision of Future
Framework
Invited Diverse Set of Evaluators
Asked to Give a “Last Lecture”
Visions of “How We Should Practice Evaluation in the 21st Century”
Reactor Panel
Audience Participation
85
Methods for Vision of Future
Framework
86
Visions for the Future of Evaluation
Practice
Social Experimentation - Cook
The Transdisciplinary Vision – Scriven
Empowerment Evaluation – Fetterman
Fourth Generation Evaluation - Lincoln
Inclusive Evaluation - Mertens
Results-oriented Management - Wholey
Theory-driven Evaluation - Donaldson
88
Reconciling Diverse Visions
Argue for Superiority
Toward Integration - Mark
Embracing Diversity – Donaldson
89
Research on Evaluation
Advances Theory and Practice
Links Theory and Practice
Utilization (early work)
Practitioner's Behavior
90
Evaluation Use
How will information be used and by
whom?
Instrumental or knowledge generation
Varying degrees of participation
Intended to increase instrumental use
91
Professional Guidelines
AEA Guiding Principles
Program Evaluation Standards
Evaluator Competencies
Cultural Competence
92
Cultural Competence
Public Statement on Cultural
Competence in Evaluation
Adopted April 2011
AEA Public Statement on Cultural
Competence in Evaluation
A culturally competent evaluator is
prepared to engage with diverse
segments of communities to include
cultural and contextual dimensions
important to the evaluation. Culturally
competent evaluators respect the
cultures represented in the evaluation
throughout the process.
Optimal Cultural Responsiveness
The capacity to engage respectfully,
authentically, and effectively with
diverse people – understanding and
taking into account the impact of
culture on all aspects of evaluation.