Methods of Data Processing for Quantitative and Qualitative Research
About This Presentation
This presentation is about methods of data processing.
Added: Oct 06, 2025 | 41 slides | Language: en
METHODS OF DATA PROCESSING
Presented by: Group 4
de los Reyes, Wendel
Galao, Ingrid
Malabo, Joshua
Uypala, Darlene
Barbadillo, Khara
Bombeo, Matt Kervy
Carlem, Mel Audrey
Dadang, Eva Stella
WHAT IS DATA PROCESSING?
Data processing involves editing, coding, classifying, and presenting data through charts or diagrams. It is a series of actions to verify, organize, transform, integrate, and extract data for subsequent use, with methods rigorously documented. According to Calmorin and Calmorin (2007), data processing translates information manually or electronically into qualitative form for research analysis.
STEPS IN DATA PROCESSING
1. CLASSIFICATION OR CATEGORIZATION
Classification is the process of grouping statistical data into understandable and similar categories for easier interpretation. It is needed when data is diverse, so that it can be presented and analyzed meaningfully. Good classification should be uniform, clear, accurate, purposeful, and homogeneous (https://www.mbaknol.com/research-methodology/methods-of-data-processing-in-research/).
2. CODING OF DATA
Coding of data is most useful with research instruments that use open-ended questions (Calmorin & Calmorin, 2007). Coding is necessary for efficient analysis: through it, many replies can be reduced to a small number of classes that contain the critical information required for analysis. Coding decisions should be made at the design stage of the questionnaire.
The coding process involves several steps:
1. Studying responses (or preparing interview schedules for precoded questions);
2. Developing a coding frame with possible answers and code numbers/symbols;
3. Fitting answers to the coding frame; and
4. Transcription, where information is transferred to a summary sheet (transcription sheet) containing answers/codes for all respondents.
A coding frame is a set of explicit rules and conventions used to classify observations of a variable into values, which are then transformed into numbers.
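As an illustration, a coding frame can be sketched as a mapping from open-ended answers to numeric codes. The question, categories, and code numbers below are hypothetical, not taken from the source:

```python
# A hypothetical coding frame for an open-ended question such as
# "Why did your child leave school?" Categories and codes are illustrative only.
CODING_FRAME = {
    "needed to work": 1,
    "school too far": 2,
    "family separation": 3,
    "other": 9,  # residual category for unanticipated answers
}

def code_response(response: str) -> int:
    """Fit a raw answer to the coding frame; unmatched answers fall into 'other'."""
    return CODING_FRAME.get(response.strip().lower(), CODING_FRAME["other"])

# Transcription: transfer coded answers for all respondents to a summary sheet.
responses = ["Needed to work", "school too far", "moved abroad"]
transcription_sheet = [code_response(r) for r in responses]
print(transcription_sheet)  # [1, 2, 9]
```

The residual "other" category mirrors step 3 above: every answer must fit somewhere in the frame so the transcription sheet is complete for all respondents.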
3. TABULATION OF DATA
Tabulation of data is the process of organizing and summarizing raw information into compact tables for easier analysis. It is an important step because it helps present data in a clear and systematic way. Tabulation can be done manually, mechanically, or electronically, depending on the size and type of study, costs, time constraints, and availability of computer programs.
There are two main types of tabulation:
1. Simple tabulation, which shows information about one variable or independent question; and
2. Complex tabulation, which displays data involving two or more related variables to show their relationships.
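The two types of tabulation can be sketched with the standard library. The survey records below are hypothetical:

```python
from collections import Counter

# Hypothetical survey records: each respondent's sex and yes/no answer.
records = [
    ("male", "yes"), ("female", "yes"), ("male", "no"),
    ("female", "yes"), ("male", "yes"),
]

# Simple tabulation: frequencies of one variable (the answer alone).
simple = Counter(answer for _, answer in records)
print(simple)  # Counter({'yes': 4, 'no': 1})

# Complex tabulation: cross-tabulating two related variables (sex x answer).
complex_tab = Counter(records)
print(complex_tab[("male", "yes")])  # 2
```

In practice, electronic tabulation of larger datasets would typically use a spreadsheet pivot table or `pandas.crosstab`, but the idea is the same: counts over one variable versus counts over combinations of variables.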
4. DATA DIAGRAMS
Diagrams are charts and graphs used to present data. They catch the reader's attention and help present data more effectively. Data diagrams can include elements like color, labels, and annotations to highlight key information. They are essential tools for simplifying complex data sets and making them accessible to a wide audience.
SCOPE AND PURPOSE OF DATA ANALYSIS
Data analysis is the process of answering research questions through examining and interpreting data. It involves presenting results clearly using graphs and tables, interpreting findings, providing implications, and citing related studies to support claims. It is essential in understanding results from surveys, administrative sources, and pilot studies, as well as in identifying data gaps, redesigning surveys, planning new activities, and formulating quality outcomes.
The data presentation summarizes respondents' background information and data collected through literature, surveys, interviews, and observations (Samo, 2010). Interpreting data aims to make it intelligible so that research problems can be studied, tested, and conclusions drawn. Survey data can be used for descriptive or analytic purposes, while interpretation assigns meaning to the findings and identifies their significance and implications (Tania, 2014).
Quantitative approaches are useful when summarizing repeated participatory processes, such as focus group discussions that generate seasonal calendars and Venn diagrams (Abeyasokera, n.d.). Since quantitative research produces large amounts of data, organization and summarization through statistical tools are necessary. Spreadsheet software and statistical programs, such as Excel and Access, are commonly used, with Excel being widely accessible (Wilder Research, 2009).
KEY COMPONENTS OF DATA ANALYSIS PLAN
1. Purpose of the Evaluation
Clearly defining the objective and scope of the analysis, including what problem or question is being addressed and what insights are needed.
2. Questions
Identifying specific, measurable, and relevant questions that the analysis aims to answer, which will guide the data collection and analysis process.
3. What You Hope to Learn from the Question
Outlining the expected outcomes or insights that the analysis will provide, including what specific information or knowledge will be gained.
4. Analysis Technique
Selecting the appropriate statistical or analytical methods to apply to the data, including data modeling, data mining, or machine learning techniques, to answer the research questions.
5. How Data Will Be Presented
Determining the most effective way to communicate the findings and insights, including the type of visualizations, reports, or dashboards that will be used to present the results to stakeholders.
DATA INTERPRETATION
Research interpretation is the adequate exposition of the true meaning of the material in terms of the study's purpose (Reyes, 2004). After data are gathered and tabulated, the researcher must analyze them by planning key evaluation questions and strategies that match the research problem.
Reyes (2004) emphasized that interpretation is closely linked with analysis, making it a special aspect of the process. Findings should be presented objectively and concisely, often with graphs, tables, and charts to clarify results. These non-textual elements must support, not replace, the written explanation.
Results and discussions should be systematic, logical, and comprehensive, blending findings with the literature review and theoretical framework. They are written in the present tense, unlike the methodology, which uses the past tense.
STEPS FOR DATA INTERPRETATION
1. Revisit the main and sub-problems.
Thoroughly review the original research questions to guide the data analysis.
2. Describe the data.
Provide a detailed overview of the dataset's source, collection methods, and key characteristics.
3. Plan an appropriate way to present the collected data, whether in tabular, graphical, or another form.
Strategize effective presentation methods, such as tables or charts, to highlight key patterns.
4. Plug in additional information.
Incorporate external sources or literature to enrich the interpretation and provide a broader perspective.
5. Provide closure or a concluding statement in every data interpretation.
Summarize the main insights and conclusions, offering a final statement on the findings' significance.
BASIC ANALYSIS OF QUANTITATIVE INFORMATION
1. Keep a Master Copy of Data
Store the original dataset safely. Use a duplicate file for editing, coding, and calculations.
2. Tabulate the Data
Count responses for each option (e.g., Yes/No, rankings, ratings). Present totals clearly in tables or charts.
3. Compute Averages for Ratings/Rankings
Calculate the mean score for each question. Example: "The average rating for Question 1 was 2.4." Averages are more insightful than frequency counts alone.
4. Show the Range of Responses
Report the distribution of answers, not only averages. Example: 20 respondents chose "1," 30 chose "2," 20 chose "3." This helps show variation and spread in opinions.
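Steps 3 and 4 can be sketched in a few lines; the ratings below reuse the 20/30/20 split from the range example, which is hypothetical:

```python
from collections import Counter
from statistics import mean

# Hypothetical ratings for one question on a 1-3 scale:
# 20 respondents chose 1, 30 chose 2, 20 chose 3.
ratings = [1] * 20 + [2] * 30 + [3] * 20

# Step 3: compute the average rating.
avg = mean(ratings)
print(f"The average rating was {avg:.1f}.")  # 2.0

# Step 4: show the distribution, not only the average.
distribution = Counter(ratings)
print(dict(sorted(distribution.items())))  # {1: 20, 2: 30, 3: 20}
```

Note how the distribution carries information the mean alone hides: two questions can share an average of 2.0 while one has every answer at "2" and the other is split between "1" and "3".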
STAGES OF ANALYSIS AND INTERPRETATION OF FINDINGS
There are four main stages in the analysis and interpretation of qualitative information. These are discussed in more detail in several textbooks, including Patton (1986, 1990) and Miles and Huberman (1994). Here, the focus is on practical tasks rather than theoretical issues.
4 MAIN STAGES OF ANALYSIS AND INTERPRETATION OF FINDINGS
Patton (1986, 1990); Miles and Huberman (1994); Silverman (1994)
1. Organize & Prepare Data (collect, transcribe, arrange). "The interview results were transcribed…" (this is the preparation of raw qualitative data).
2. Reduce / Code Data (identify themes, categories, statistics). Moving from raw interview transcripts to a "basis for discussion" (the implied reduction/coding of qualitative answers).
3. Display Data (tables, charts, graphs, software output). "Data analysis procedures involved the use of Minitab … statistical formulas … frequency, percentage, mean, standard deviation" (this is the data display/summary stage).
4. Draw & Verify Conclusions (interpret, generalize, discuss). "…used as basis for discussion of the results… allows generalizations of results…" (this is the interpretation/conclusion stage).
SAMPLE RESULTS AND DISCUSSION
Out of 292 respondents, 275 (94.2%) had been enrolled in some level of schooling, while 17 (5.8%) never attended. Among them, 42 (14.4%) reached kindergarten and 180 (61.6%) attended elementary grades. Many children in the higher elementary levels left school early to work, limiting their chances of continuing to high school. A total of 182 parents attained only elementary education, which may explain their lack of motivation to send children to school. Studies show that poor literacy and numeracy among parents strongly affect children's education (Bynner & Parsons, 1997; Moser, 1999).
As shown in Figure 1, very few parents completed college, and more pursued vocational training after elementary rather than moving on to high school. Other findings revealed that poverty, the presence of multiple school-aged children, and family separation influenced schooling. In many cases, children lived with their mother after separation. According to Amato and Keith (1991), school dropout and delinquency are common negative outcomes linked to these conditions.
BASIC STATISTICAL TOOL
In the Information Age, the challenge is managing
and interpreting the vast amount of available data
(Dillard, n.d.). Statistical tools are used to organize,
simplify, and analyze data for accuracy and
precision. They help in comparing sets of data,
reducing bias, and providing objective estimates to
determine whether changes have occurred.
Statistics are not an end but a tool to make
sense of data. Without statistical treatment, raw
data may be misleading. Statistical tests allow
researchers to compare groups, determine the
probability of differences, and validate
hypotheses (Sarno, 2010).
COMMON STATISTICAL TOOLS
1. The Arithmetic Mean
The arithmetic mean (average) is the sum of all values divided by the number of values. It represents a central value in a dataset. Example: if five students scored 80, 85, 90, 75, and 70, the mean score is (80+85+90+75+70)/5 = 80.
2. Frequency Distribution
A frequency distribution is a table that shows how often each value or range of values occurs in a dataset. Example: scores grouped as 70–79: 5 students; 80–89: 10 students; 90–100: 3 students.
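Both tools can be computed with Python's standard library. The scores are the five from the arithmetic-mean example; the class intervals are illustrative:

```python
from statistics import mean

# The five scores from the arithmetic-mean example above.
scores = [80, 85, 90, 75, 70]
print(mean(scores))  # 80

def frequency_distribution(values, bins):
    """Count how many values fall into each (low, high) interval, inclusive."""
    return {f"{lo}-{hi}": sum(lo <= v <= hi for v in values) for lo, hi in bins}

# Group the same scores into class intervals.
table = frequency_distribution(scores, [(70, 79), (80, 89), (90, 100)])
print(table)  # {'70-79': 2, '80-89': 2, '90-100': 1}
```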
3. Pie Chart
A circular chart divided into sectors, each displaying the relative size (percentage) of a category.
[Example pie chart: Frustration Level 45.5%, Enhancement Level 31.8%, Independent Level 22.7%]
4. Bar Chart
A chart with rectangular bars whose lengths are proportional to the magnitudes or frequencies (number of observations) of the categories they represent.
[Example bar chart: male vs. female scores in Science, Math, and English; y-axis 0–20]
5. Standard Deviation
Standard deviation (σ) measures the spread of data around the mean. A high standard deviation indicates data is widely spread, while a low standard deviation means data is closely clustered around the mean. It helps determine the dispersion, consistency, and distribution of responses, providing a better understanding of the data when used with the mean (Tania, 2014).
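A quick sketch using the standard library, with hypothetical ratings; note that `statistics.pstdev` computes the population standard deviation, while `statistics.stdev` computes the sample version:

```python
from statistics import mean, pstdev

# Two hypothetical sets of ratings with the same mean but different spread.
consistent = [3, 3, 3, 3, 3]
spread_out = [1, 2, 3, 4, 5]

# Same central value...
print(mean(consistent), mean(spread_out))  # 3 3

# ...but very different dispersion around it.
print(pstdev(consistent))  # 0.0
print(pstdev(spread_out))  # ~1.414
```

This is the point made above: the mean alone cannot distinguish a unanimous group from a polarized one; the standard deviation can.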
6. T-Tests
T-tests are used to test whether the difference between means is statistically significant, and whether the sample is representative of the population. T-testing is a form of hypothesis testing, which assesses whether a certain premise actually holds for your data set or population. In data analysis and statistics, a result of a hypothesis test is considered statistically significant if it is unlikely to have occurred by random chance. Hypothesis tests are used in everything from science and research to business and economics.
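As an illustration with hypothetical scores, the Welch t statistic (a common t-test variant that does not assume equal variances) can be computed from the standard library; converting t into a p-value would normally use a statistics package such as `scipy.stats.ttest_ind`:

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for the difference between two sample means."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / (va / na + vb / nb) ** 0.5

# Hypothetical test scores for two groups of students.
group_a = [80, 85, 90, 75, 70]
group_b = [60, 65, 70, 55, 50]
print(round(welch_t(group_a, group_b), 2))  # 4.0
```

A large |t| (here, the group means differ by 20 points relative to a standard error of 5) suggests the difference is unlikely to be due to chance, which a p-value would then quantify.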
7. Pearson (r) Correlation
Pearson (r) correlation is used to find the correlation between at least two continuous variables. The value of r lies between -1.00 and +1.00: 0.00 indicates no correlation, while -1.00 or +1.00 indicates a perfect (negative or positive) correlation.
8. Chi-square Test
There are two types of chi-square test, and both involve categorical data. One type (the goodness-of-fit test) compares the frequency count expected in theory against what is actually observed. The second type is known as the chi-square test with two variables, or the chi-square test for independence.
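A minimal sketch of both tools on hypothetical data; in practice one would use a statistics package (e.g. `scipy.stats.pearsonr` and `scipy.stats.chisquare`), which also returns p-values:

```python
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two continuous variables."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def chi_square(observed, expected):
    """Goodness-of-fit statistic: sum of (O - E)^2 / E over all categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical continuous data: study hours vs. exam scores (perfect linear trend).
hours = [1, 2, 3, 4, 5]
scores = [52, 54, 56, 58, 60]
print(pearson_r(hours, scores))  # 1.0

# Hypothetical categorical data: 60 die rolls, observed counts vs. equal expected counts.
print(chi_square([8, 12, 10, 14, 6, 10], [10] * 6))  # 4.0
```

The chi-square value alone is not a verdict; it is compared against the chi-square distribution (with the appropriate degrees of freedom) to decide whether the observed counts deviate significantly from theory.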
STATISTICAL TREATMENT OF DATA
Statistical treatment of data refers to the methods and techniques used to analyze, interpret, and present data in research. It ensures that raw data is organized, summarized, and transformed into meaningful information.
The "Statistical Treatment of Data" section is where you tell the reader which statistical tools you are going to use (and sometimes why you chose them). It is not where you show the results yet; it is where you explain your plan or procedure for analyzing the data.