Reliability and Validity of Questionnaire

ataliyaafina · 24 slides · Jun 07, 2024

About This Presentation

RESEARCH METHODOLOGIES


Slide Content

RNB31103 BIOSTATISTICS

VALIDITY AND RELIABILITY OF QUESTIONNAIRE
Y.S. PEK
NURSING PROGRAMME

VALIDITY
•How well a survey measures what it is expected to measure.
(Figure: target diagrams contrasting "Not Valid" and "Valid")

ASSESSMENT OF VALIDITY
•Validity of a questionnaire is measured
commonly in three ways
–Face validity
–Content validity
–Criterion validity

Face validity
•A quick, not very detailed review of survey items by untrained judges
–Example: distributing the questionnaire to untrained individuals to see whether they think the items look okay
–Very casual and informal method
–Many do not really consider this a measure of validity at all

Content Validity
•Subjective measure of how appropriate the items are to a set of reviewers who have knowledge of the subject matter
–Usually consists of an organized review of the survey's contents to ensure that it contains everything it should and doesn't include anything that it shouldn't
–Still very qualitative
–Not an objective method

Criterion Validity
•Measure of how well one instrument stacks up against another instrument or predictor
–Concurrent validity: assess your questionnaire against a "gold standard"
–Predictive validity: assess the ability of your instrument to forecast future events, behaviour, attitudes, or outcomes

RELIABILITY
•The degree of stability exhibited when a measurement is repeated under identical conditions.
(Figure: target diagrams contrasting "Not Reliable" and "Reliable")

Assessment of Reliability
•Reliability is assessed in 3 forms
–Test-retest reliability
–Alternate-form reliability
–Internal consistency reliability (Cronbach’s Alpha)

Test-retest Reliability
•Most common form in surveys.
•Measured by having the same respondents complete a survey at two different points in time to see how stable the responses are.
•Usually quantified with a correlation coefficient (r value).
•In general, r values are considered good if r ≥ 0.70.
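The test-retest approach above can be sketched in Python. The scores below are made-up illustration data (not from the slides), and the correlation is computed from the standard Pearson formula:

```python
# Test-retest reliability: correlate the same respondents' scores
# from two administrations of the same survey.

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

time1 = [3, 4, 2, 5, 4, 3, 5, 2]   # hypothetical first administration
time2 = [3, 5, 2, 5, 3, 3, 4, 2]   # same respondents, second administration

r = pearson_r(time1, time2)
print(f"test-retest r = {r:.2f}")  # r >= 0.70 is usually considered good
```

In practice `scipy.stats.pearsonr` gives the same coefficient plus a p-value; the hand-rolled version is shown only to make the formula explicit.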

Internal consistency reliability
•Applied not to one item, but to groups of items that are thought to measure different aspects of the same concept
•Cronbach's coefficient alpha
–Measures internal consistency reliability among a group of items combined to form a single scale
–It is a reflection of how well the different items complement each other in their measurement of different aspects of the same variable or quality
–Interpret like a correlation coefficient (0.70 is good)

Cronbach's Coefficient Alpha
•It is most commonly used when you have multiple Likert questions in a survey/questionnaire that form a scale or subscale.
•Cronbach's alpha simply provides you with an overall reliability coefficient for a set of variables (e.g. questions).
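As a sketch of the computation, Cronbach's alpha for a scale of k items is k/(k−1) · (1 − Σvar(itemᵢ)/var(total)). The 4-item, 6-respondent Likert responses below are invented for illustration:

```python
# Cronbach's alpha for a hypothetical multi-item Likert scale.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

def cronbach_alpha(items):
    """items: list of k lists, each holding one item's scores, respondent by respondent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]       # per-respondent scale total
    sum_item_vars = sum(variance(scores) for scores in items)
    return k / (k - 1) * (1 - sum_item_vars / variance(totals))

# Made-up responses: 4 items x 6 respondents, scored 1-5.
items = [
    [4, 3, 5, 2, 4, 3],
    [4, 2, 5, 2, 3, 3],
    [3, 3, 4, 1, 4, 2],
    [5, 3, 5, 2, 4, 3],
]

print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")  # interpret like r: 0.70+ is good
```

Statistical packages (SPSS's Reliability Analysis, or `pingouin.cronbach_alpha` in Python) report the same quantity; the point here is only to show what goes into it.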

9 May 2022

RESULTS

Kline, P. (2000). The handbook of psychological testing (2nd ed.). London: Routledge. p. 13.
George, D., & Mallery, P. (2003). SPSS for Windows step by step: A simple guide and reference, 11.0 update (4th ed.). Boston: Allyn & Bacon.

RESULTS

Inter-observer Reliability
•How well two evaluators agree in their assessment of a variable.
•Use a correlation coefficient to compare data between observers.
•If the correlation is high and statistically significant, the inter-observer reliability is good.

Inter-observer reliability
•Cohen's kappa can be used when you have two or more raters (also known as "judges" or "observers") who are responsible for measuring a variable on a categorical scale.
•Cohen's kappa (κ) is such a measure of inter-rater agreement for categorical scales when there are two raters.
•Kappa (κ) values increasingly greater than 0 (zero) represent increasingly better-than-chance agreement between the two raters, up to a maximum value of +1, which indicates perfect agreement (i.e., they agreed on everything).
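Cohen's kappa corrects observed agreement for the agreement expected by chance: κ = (pₒ − pₑ)/(1 − pₑ). A minimal sketch, using invented yes/no ratings from two hypothetical raters:

```python
# Cohen's kappa for two raters labelling the same cases on a categorical scale.
from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n   # observed agreement
    c1 = Counter(rater1)
    c2 = Counter(rater2)
    categories = set(c1) | set(c2)
    # Chance agreement: product of each rater's marginal proportions per category.
    pe = sum((c1[c] / n) * (c2[c] / n) for c in categories)
    return (po - pe) / (1 - pe)

rater1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "yes"]
rater2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no", "yes", "no"]

print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")  # prints "kappa = 0.40"
```

Here the raters agree on 7 of 10 cases (pₒ = 0.70), but chance alone would produce pₑ = 0.50, so κ = 0.40 — well short of the perfect agreement (κ = +1) described above. `sklearn.metrics.cohen_kappa_score` computes the same statistic.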

(Table: interpretation of kappa values — from Altman (1999), adapted from Landis & Koch (1977))

Summary
