Donaldson & Christie 2019 FOUNDATIONS Workshop 8-14-19 (2).ppt


About This Presentation

This presentation covers the foundations of evaluation and applied research for monitoring and evaluation professionals who are new to the evaluation field or who want a refresher.


Slide Content

1
Foundations of Evaluation and
Applied Research Methods
Stewart I. Donaldson
Christina A. Christie
CGU Professional
Development Workshops
Claremont, CA
August 14, 2019

2
Introduction Exercise

Small Group Discussion
3

LaVelle, 2012

5
Overview
Evaluation
Program Evaluation
Applied Research
Applied Research Methods

6
Patton, 2008

7

Evaluation is so much MORE than the
Application of Research Methods

A Positive Social Epidemic?

You Can Run But You Can’t
Hide

Globalization of Evaluation

Growth of Evaluation Professional
Associations
1980s – Only 3 National and Regional
Evaluation Societies
1990 – 5
2000 – More than 50
2015 – Approximately 225 including a
Formal International Cooperation
Network

Examples of Professional
Evaluation Organizations
American Evaluation Association
Canadian Evaluation Society
European Evaluation Society
Australasian Evaluation Society
African Evaluation Association (AfrEA)
IDEAS
International Organization for Cooperation in
Evaluation (IOCE)
EvalPartners

Exemplary Evaluations

15

16

Evaluation as a Transdiscipline
A discipline with a clear
definition, subject matter,
logical structure, and
multiple fields of application.

Applications of Evaluation
•Education

Applications of Evaluation
•Education
•Literacy

Applications of Evaluation
•Education
•Literacy
•Poverty

Applications of Evaluation
•Education
•Literacy
•Poverty
•Prejudice &
Discrimination

Applications of Evaluation
•Education
•Literacy
•Poverty
•Prejudice &
Discrimination
•Conflict

Applications of Evaluation
•Education
•Literacy
•Poverty
•Prejudice &
Discrimination
•Conflict
•Violent Crime

Applications of Evaluation
•Education
•Literacy
•Poverty
•Prejudice &
Discrimination
•Conflict
•Violent Crime
•Drug & Alcohol
Abuse

Applications of Evaluation
•Education
•Literacy
•Poverty
•Prejudice &
Discrimination
•Conflict
•Violent Crime
•Drug & Alcohol
Abuse
•Human Resources

26
Evidence-Based Practice
Highly Valued
Global
Multidisciplinary
Many Applications

27
In God We Trust
ALL OTHERS MUST HAVE CREDIBLE
EVIDENCE

How is this Epidemic Spreading?

Technological Innovations that
Improve Evaluation Practice
Globally Connected,
Technology-Facilitated, Fast-
Paced, Real-Time World
(Patton, 2011)

Claremont Graduate University Summer Workshops, 2009
Webinar with Guests in 5 Different
Geographic Locations

Free E-Learning Certificate
Program

Certificate of the Advanced Study
of Evaluation (distance education)
34

New Online M.S. Degree in
Evaluation & Applied Research
35

36
Core Knowledge Base of
Professional Evaluation
 Evaluation Theory
 Evaluation Design
 Evaluation Methods
 Evaluation Practice
The Evaluation Profession
Research on Evaluation

AEA Evaluator Competencies Task
Force 2015

Evaluator Competencies

Evaluator Competencies

Evaluator Competencies

Credentialing

45

THE FEAR FACTOR IN
EVALUATION

Will I be let go?
How much time will they take?
Will they listen to me?
Will they understand our program?
What if we get ranked low?
(fears surrounding the EVALUATOR)

Excessive Evaluation Anxiety (XEA)
Anxiety because of the unknown potential
harms of an evaluation.
Fear of the prospect of a negative evaluation.

49
Evaluation Anxiety
Consequences of XEA
Signs of XEA
Sources of XEA
Strategies for Managing XEA
Psychology of Evaluation

TOWARD A POSITIVE
PSYCHOLOGY OF EVALUATION

Positive Psychology of Evaluation: How to Engage
Stakeholders in a Positive and Productive Process of:

52
Why Evaluate?
…Purposes of Evaluation
Program and organizational improvement
Oversight and compliance
Assessment of merit and worth
Knowledge development

53
Reasons to Evaluate
Determine the need for a program (needs
assessment)
Assist in program planning by identifying potential
program models to achieve goals (needs
assessment/program planning)
Describe program implementation
(monitoring/process)
Determine if goals have been achieved (outcome)
Judge overall benefit of program (relative value and
cost)

54
Evaluation and Applied Research Methods-
The Tools of the Trade
 Evaluation & Research Design
 Data Collection
 Data Analysis and Interpretation
 Logic Models
 Program Theories

55
Evaluation Questions
The key Evaluation Questions determine which tools
we use, and how and when to use these "tools of
the trade"

56
Evaluation Questions
Formative: Information for program improvement
Summative: Decisions about program adoption, continuation, or
expansion
Needs Assessment: Determining whether (and which)
problems/needs exist
Process or Monitoring: Describe how a program is being
delivered
Outcome: Describing, exploring and determining what occurred
in program recipients or communities

57
Evaluation Purpose:
Formative & Summative vs. Process & Outcome
Formative & Summative
The intention of the
evaluator in undertaking
the study–
to provide information
for program
development or render
a judgment about it
Process & Outcome
The phase of the
program being studied

58
Evaluation Purpose: Formative and
Summative Evaluation
FORMATIVE
Designed to assist in
program development
SUMMATIVE
Designed to assist in
decisions about whether
to continue or end a
program, extend it or
cut it back – Go/No-go

59
Evaluation Purpose: Outcome and
Process Evaluation
OUTCOME
Measures effects,
results, impact on
participants
These can be intended
or unintended
Asks the questions:
What was the impact?
and Compared to what?
Both intermediate and
long-term
PROCESS
What is going on?
What does the program
do? Are folks doing
what they said they
would?
What produced the
outcomes and Why?
How did/does it work?
How was it
implemented?

60
Evaluation & Research Design
Experimental (Gold Standard?)
Quasi-Experimental
Non-Experimental

61
Data Collection Methods
 Quantitative
Observations that lend themselves to numeric
representations (numbers)
 Qualitative
Observations that do NOT lend themselves to
numeric representations (words)
 Mixed-methods

62
Data Collection Methods
 Surveys
 Interviews
 Focus Groups
 Observations
 Tests (Assessments)
 Document Review

63
Data Analysis and Interpretation
 Provides Meaning to the Data
Qualitative: Coding and Themes
Quantitative: Descriptive and Predictive
 Standards of Judgment
 Communication and Reporting of Results
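As an informal illustration of the quantitative side of this step, the sketch below computes basic descriptive statistics for a small set of outcome scores; the scores, group labels, and variable names are hypothetical, invented for illustration rather than taken from the workshop.

```python
# Minimal sketch of descriptive quantitative analysis for evaluation data.
# All scores and group labels below are hypothetical illustrations.
from statistics import mean, median, stdev

program_scores = [72, 85, 78, 90, 66, 81, 88]      # hypothetical program group
comparison_scores = [70, 74, 65, 80, 68, 73, 71]   # hypothetical comparison group

for label, scores in [("Program", program_scores), ("Comparison", comparison_scores)]:
    print(f"{label}: n={len(scores)}, mean={mean(scores):.1f}, "
          f"median={median(scores)}, sd={stdev(scores):.1f}")
```

Qualitative analysis (coding and themes) follows a different logic and is typically done with dedicated tools or by hand rather than with a script like this.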

64
Small Group Exercise
 Case Scenario
 Determine Appropriate Methods
 Rationale for Method Choice

65
What Counts as Credible
Evidence?

66
Experimental Design:
Gold Standard?
Random Assignment
Experimental Control
Ruling Out Threats to Validity
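To make the random-assignment idea concrete, here is a minimal sketch that randomly assigns a hypothetical participant list to treatment and control conditions; the participant IDs, group size, and 50/50 split are assumptions made purely for illustration.

```python
# Minimal sketch of random assignment for an experimental (RCT-style) design.
# Participant IDs and the even split are hypothetical illustrations.
import random

participants = [f"P{i:03d}" for i in range(1, 21)]   # 20 hypothetical participants

random.seed(42)                # fixed seed so the assignment is reproducible
random.shuffle(participants)   # randomize the order

midpoint = len(participants) // 2
treatment = participants[:midpoint]
control = participants[midpoint:]

print("Treatment:", sorted(treatment))
print("Control:  ", sorted(control))
```

Random assignment of this kind is what underpins experimental control and helps rule out many threats to internal validity, which is the point of the bullets above.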

67
The Solution: In Search of a Time
Machine

68
Supreme Courts of Credible
Evidence
What Works Clearinghouse
Campbell Collaboration
Cochrane Collaboration

69
Small Group Exercise
Should RCTs be the Gold Standard for Impact
Evaluation Design?
Pro
Con

70
Challenges of the Gold Standard
AEA Statement vs. Not AEA Statement
Theoretical
Practical
Methodological
Ethical
Ideological
Political

So What Counts as Credible
Evidence (Donaldson, 2008)?
It depends:
Question(s) of Interest
The Context
Assumptions of Evaluators &
Stakeholders
Theory of Practice
Practical, Time, & Resource
Constraints

CDC: Gathering Credible
Evidence
Definition: Compiling information that
stakeholders perceive as trustworthy and
relevant for answering their questions.
Such evidence can be experimental or
observational, qualitative or quantitative,
or it can include a mixture of methods.
Adequate data might be available and
easily accessed, or it might need to be
defined and new data collected.

From Experimenting Society to
Evidence-based Global Society?
From “RCTs” as the Gold
Standard to “Methodological
Appropriateness” as the Platinum
Standard
Donaldson, 2009

74
Roles for Theory in Evaluation
Practice
Evaluation Theory
Program Theory
Social Science Theory

76
Example: Winning New Jobs
Program Theory
Program impact model: the WNJ Program builds Job Search Self-Efficacy, Job Search Skills, and Inoculation Against Setbacks, which in turn lead to Reemployment and Mental Health.
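One informal way to record a program impact theory like this for discussion is as a small directed graph; the sketch below encodes the WNJ model above as plain Python dictionaries. The data-structure choice, and the exact set of arrows, are assumptions for illustration only, not something presented in the workshop.

```python
# Minimal sketch: the WNJ program impact theory as a directed graph.
# Node names follow the slide; edges approximate the model summarized above,
# and the exact arrows in the original diagram may differ.
wnj_theory = {
    "WNJ Program": ["Job Search Self-Efficacy", "Job Search Skills",
                    "Inoculation Against Setbacks"],
    "Job Search Self-Efficacy": ["Reemployment", "Mental Health"],
    "Job Search Skills": ["Reemployment", "Mental Health"],
    "Inoculation Against Setbacks": ["Reemployment", "Mental Health"],
}

for cause, effects in wnj_theory.items():
    for effect in effects:
        print(f"{cause} -> {effect}")
```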

77
Program Theory-Driven Evaluation
Science: 3 Key Steps
Develop Program Impact Theory
Develop and Prioritize Evaluation Questions
Answer Evaluation Questions

78
CDC Evaluation Framework:
6 Key Steps + 4 Standards

79
Expert Interviews & Cases

80
Evaluation Theory Exercise
 Small Groups of 5-10
Evaluate the Room from the Perspective
Presented on the Handout

81
Evaluation Theory
Prescriptive (not empirically based)
Guide Practice
e.g., Design, Methods, Breadth and Depth
of Stakeholder Involvement
Driven by the Primary Role of Evaluation

82

83

84
Methods for Vision of Future
Framework
Invited Diverse Set of Evaluators
Asked to Give a "Last Lecture"
Visions of “How We Should Practice
Evaluation in the 21st Century"
Reactor Panel
Audience Participation

85
Methods for Vision of Future
Framework

86
Visions for the Future of Evaluation
Practice
Social Experimentation - Cook
The Transdisciplinary Vision – Scriven
Empowerment Evaluation – Fetterman
Fourth Generation Evaluation - Lincoln
Inclusive Evaluation - Mertens
Results-oriented Management - Wholey
Theory-driven Evaluation - Donaldson

87
More Evaluation Approaches
Utilization-Focused - Patton
Developmental Evaluation – Patton
Principles-based Evaluation – Patton
Culturally Responsive Evaluation – Hood & Hopson
Realist – Pawson

88
Reconciling Diverse Visions
Argue for Superiority
Toward Integration - Mark
Embracing Diversity – Donaldson

89
Research on Evaluation
Advances Theory and Practice
Links Theory and Practice
Utilization (early work)
Practitioner's Behavior

90
Evaluation Use
How will information be used and by
whom?
Instrumental or knowledge generation
 Varying degrees of participation
Intended to increase instrumental use

91
Professional Guidelines
 AEA Guiding Principles
 Program Evaluation Standards
 Evaluator Competencies
 Cultural Competence

92
Cultural Competence
Public Statement on Cultural
Competence in Evaluation
Adopted April 2011

AEA Public Statement on Cultural
Competence in Evaluation
A culturally competent evaluator is
prepared to engage with diverse
segments of communities to include
cultural and contextual dimensions
important to the evaluation. Culturally
competent evaluators respect the
cultures represented in the evaluation
throughout the process.

Optimal Cultural Responsiveness
The capacity to engage respectfully,
authentically, and effectively with
diverse people – understanding and
taking into account the impact of
culture on all aspects of evaluation.

Donaldson & Associates Free Resources @
https://www.donaldsonandassociates.com/

96
Conclusions