CHOOSING APPROPRIATE QUANTITATIVE RESEARCH DESIGN LESSON # 5
RESEARCH DESIGN: WHAT IS IT? - The overall plan or strategy for putting the components of your study together in a logical, coherent manner
3 TRADITIONAL CATEGORIES OF RESEARCH DESIGN
- EXPLORATORY RESEARCH DESIGN
- DESCRIPTIVE RESEARCH DESIGN
- CAUSAL RESEARCH DESIGN
EXPLORATORY RESEARCH DESIGN - Used to establish an initial understanding of and background information about a topic of interest, often when few or no earlier studies relevant to the research problem can be found.
DESCRIPTIVE RESEARCH DESIGN
- USED TO GATHER INFORMATION ON CURRENT SITUATIONS AND CONDITIONS.
- HELPS PROVIDE ANSWERS TO THE 5W1H QUESTIONS (WHO, WHAT, WHEN, WHERE, WHY, HOW) OF A STUDY.
- PROVIDES ACCURATE DATA FROM A LARGE NUMBER OF SAMPLES.
- OFFERS LOGICAL CONCLUSIONS AND PERTINENT RECOMMENDATIONS.
CAUSAL RESEARCH DESIGN - USED TO MEASURE THE IMPACT OF AN INDEPENDENT VARIABLE ON A DEPENDENT VARIABLE AND TO EXPLAIN WHY CERTAIN RESULTS ARE OBTAINED. - CAN ALSO BE USED TO IDENTIFY THE EXTENT AND NATURE OF A CAUSE-AND-EFFECT RELATIONSHIP.
EXAMPLES OF DESCRIPTIVE RESEARCH DESIGN
- SURVEY
- CORRELATIONAL RESEARCH
- EVALUATION RESEARCH
SURVEY - USED TO SECURE OPINIONS AND IDENTIFY TRENDS THROUGH QUESTIONNAIRES AND INTERVIEWS.
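Survey results gathered through questionnaires are typically summarized as frequency and percentage distributions. A minimal sketch, using hypothetical Likert-scale responses to a single survey item (the data and item scale are illustrative, not from the lesson):

```python
from collections import Counter

# Hypothetical Likert-scale responses to one survey item
# (1 = Strongly Disagree ... 5 = Strongly Agree; illustrative data).
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

# Frequency and percentage distribution of the responses.
counts = Counter(responses)
n = len(responses)
for rating in sorted(counts):
    pct = 100 * counts[rating] / n
    print(f"Rating {rating}: {counts[rating]} responses ({pct:.0f}%)")

# The mean rating summarizes the overall trend of opinion.
mean_rating = sum(responses) / n
print(f"Mean rating: {mean_rating:.1f}")
```

This kind of tabulation is what turns raw questionnaire answers into the "current situations and conditions" a descriptive study reports.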
CORRELATIONAL RESEARCH - USED FOR STUDIES THAT AIM TO DETERMINE WHETHER A RELATIONSHIP EXISTS BETWEEN TWO OR MORE VARIABLES AND, IF SO, THE DEGREE OF THAT RELATIONSHIP.
EVALUATION RESEARCH - CONDUCTED TO ELICIT USEFUL FEEDBACK FROM RESPONDENTS IN VARIOUS FIELDS TO AID DECISION-MAKING OR POLICY FORMULATION.
COMMONLY USED TYPES OF EVALUATION BASED ON PURPOSE
- FORMATIVE EVALUATION
- SUMMATIVE EVALUATION
FORMATIVE EVALUATION - USED TO DETERMINE THE QUALITY OF IMPLEMENTATION OF A PROJECT, THE EFFICIENCY AND EFFECTIVENESS OF A PROGRAM, AND THE ASSESSMENT OF ORGANIZATIONAL PROCESSES SUCH AS POLICIES AND PROCEDURES.
EXAMPLES OF FORMATIVE EVALUATION:
- NEEDS ASSESSMENT
- PROCESS EVALUATION
- IMPLEMENTATION EVALUATION
- PROGRAM MONITORING
NEEDS ASSESSMENT - EVALUATES THE NEED FOR A PROGRAM OR PROJECT.
PROCESS EVALUATION - EVALUATES THE PROCESS OF DELIVERING A PROGRAM.
IMPLEMENTATION EVALUATION - EVALUATES THE EFFECTIVENESS AND EFFICIENCY OF A PROJECT OR PROGRAM.
PROGRAM MONITORING - EVALUATES THE PERFORMANCE AND IMPLEMENTATION OF AN ONGOING PROGRAM.
SUMMATIVE EVALUATION - DONE AFTER THE IMPLEMENTATION OF A PROGRAM. IT EXAMINES THE OUTCOMES, PRODUCTS, OR EFFECTS OF THE PROGRAM.
EXAMPLES OF SUMMATIVE EVALUATION:
- SECONDARY DATA ANALYSIS
- IMPACT EVALUATION
- OUTCOME EVALUATION
- COST-EFFECTIVENESS EVALUATION
SECONDARY DATA ANALYSIS - EXAMINES EXISTING DATA FOR NEW ANALYSIS.
IMPACT EVALUATION - USED TO EVALUATE THE OVERALL EFFECT OF A PROGRAM IN ITS ENTIRETY.
OUTCOME EVALUATION - DONE TO DETERMINE WHETHER A PROGRAM HAS PRODUCED USEFUL EFFECTS BASED ON ITS TARGET OUTCOMES.
COST-EFFECTIVENESS EVALUATION - COMPARES THE RELATIVE COSTS OF DIFFERENT COURSES OF ACTION TO THEIR OUTCOMES OR RESULTS.
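A cost-effectiveness comparison is usually expressed as cost per unit of target outcome, so that programs with different total budgets can be compared fairly. A minimal sketch, using two hypothetical tutoring programs (names, costs, and outcome counts are illustrative, not from the lesson):

```python
# Hypothetical cost-effectiveness comparison of two programs
# (all figures are illustrative).
programs = {
    "Program A": {"cost": 50_000, "students_improved": 200},
    "Program B": {"cost": 80_000, "students_improved": 400},
}

def cost_per_outcome(p):
    """Cost-effectiveness ratio: total cost per unit of target outcome."""
    return p["cost"] / p["students_improved"]

for name, p in programs.items():
    print(f"{name}: {cost_per_outcome(p):.2f} per student improved")
```

Here Program B costs more in total but less per outcome (200 vs. 250 per student improved), so by this ratio it is the more cost-effective course of action.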
POINTS OF COMPARISON

POINT OF COMPARISON | EXPLORATORY | DESCRIPTIVE | CAUSAL
Research Approach | Unstructured and flexible | Formal and structured | Highly structured
Degree of Problem Identification | Not well-defined | Variables are defined | Variables and relationships are defined
When to Use | Initial research | Often a follow-up to exploratory research | Late stage of decision-making
Goals and Objectives | Provides insights into a problem | Describes situations | Explains the cause-and-effect relationship between variables
Sample Size | Small, non-representative sample | Large, representative sample | Large, representative sample
Type of Hypothesis | Research questions | Non-directional hypothesis | Directional hypothesis
Data Management and Measurement | Data may not be statistically measurable | Data are statistically measurable | Data are statistically measurable