Longitudinal Benchmark: A Real-World UX Case Study in Onboarding by Linda Borghesani and Jaitra Dixit


About This Presentation

This is a case study of a three-part longitudinal research study with 100 prospects to understand their onboarding experiences. In part one, we performed a heuristic evaluation of the websites and the getting started experiences of our product and six competitors. In part two, prospective customers ...


Slide Content

Longitudinal Benchmark:
A Real-World UX Case Study in
Onboarding
Linda Borghesani
Jaitra Dixit
May 9th, 2025

Background
UX Researchers on the Constant Contact Product/Design team.
We collaborate with stakeholders across the company to improve the product
experience of our digital marketing tool.
2
Linda Borghesani
UX Research Manager
Lecturer, Tufts University
Email: [email protected]
LinkedIn: linkedin.com/in/lindaborghesani



Jaitra Dixit
UX Researcher

Email: [email protected]
LinkedIn: linkedin.com/in/jaitradixit

What this Case Study Offers
3
Framework: A real-world repeatable approach for longitudinal UX
benchmarking.
Lessons – mistakes, wins, and surprises.
Actionable insights you can take to your own onboarding or
benchmarking work.

Case Study:
How a year-long UX experiment reshaped the onboarding experience
4
Problem: Onboarding was fractured across teams with unclear handoffs, inconsistent labeling, and a disconnect between user expectations based on what the website promised and what the product delivered.
UXR Impact:
●Aligned expectations across stakeholders.
●Increased empathy and research-informed decisions.
●Prioritized high-friction issues.
●First impressions became more trustworthy, increasing confidence in product fit.
●Optimized onboarding user journey.

Show of Hands
5
Who has ever run a study with more than 50 participants?
Who has tackled a longitudinal study — anything over a week or with
repeat touchpoints?
How many of you have done benchmark research — comparing tools,
flows, or competitors?
Be honest... who’s ever used leftover research credits to do something a
little scrappy — but surprisingly valuable?

The Spark
6
Product/Design stakeholders engaged UX Research for iterative design concept testing for product onboarding.
What we did
●Ran a large-scale, multi-part unmoderated longitudinal study involving mixed-methods research.
●Tracked 100 users through first exposure, sign-up, and one week later.
●Two researchers watched/coded the sessions, met to reconcile and organize themes.
What we learned
●Running a longitudinal study is complex.
●Juggling mountains of qualitative data wasn’t always smooth.
●What emerged was a powerful, repeatable process for understanding user behavior over time.

Business Goals
7
New Adds
Engagement
Net Adds

It Started with a Scrappy Idea & Expiring UserTesting Credits
8
We had a hunch that onboarding was broken. That simple question launched a side project that grew into a year-long study.
We didn’t have a formal plan and didn’t know how long it would take.
So we decided to look across the full journey, through the eyes of our users.
What followed was messy, revealing, and unexpectedly powerful.

The Onboarding Challenge
9
“Proper onboarding isn’t done
to prevent churn; it’s done to
ensure the customer achieves
their desired outcome.”

~ Lincoln Murphy on Customer
Success
Introduce our product to customers so they can learn about the features of our product and develop their first impressions to set them up for success.

Think about your own product
onboarding experience.

What’s the one thing you’d test
first? How would you test it?

The Problem: Onboarding was Fractured Across Silos
11

Onboarding Experience Snapshot
12
1. Front of Site (FoS) Website
2. Sign-up Process
3. First (verification) email
4. In-Product Experience
5. Sales
6. Post Sales
7. Customer Success
8. Customer Support

Where We Started
13
●Onboarding was owned by multiple teams — but no one had the full
picture.
●Never had the opportunity to run a longitudinal study.
●Experienced with unmoderated testing and UX research methodologies.
●Expiring UserTesting credits.
Getting Buy-In
●Leadership gave the green light — but as a secondary project.
●We didn’t know how long it would take — we just knew it mattered.
What Happened
●The study ran for ~1 year (Sept 2021–Aug 2022).
●We shared insights as we uncovered them — not just at the end.
●And yes… this was all before AI tools could make it quicker!

Facts & Assumptions Workshop

Step 1: Before Anything Else: Challenge Our Assumptions
15
We kicked off with a 3-hour Facts & Assumptions Workshop.
Brought together 17 stakeholders from Product, Design, Marketing, Engineering, and Customer Success.
Our Goal:
●Build a shared understanding of the end-to-end onboarding experience.
●Surface user pain points and team blind spots.
●Identify opportunities and critical moments in the journey.
●Define clear roles and responsibilities.
●Identify facts and assumptions to reduce biases and identify knowledge gaps.

How We Aligned Around the User Journey
16
Walked the Full User Path
Mapped journey from first site visit through product signup and early usage. Highlighted where users hesitated, dropped off, or got stuck.
Watched Real Users
Reviewed unmoderated session recordings of prospects. Created a shared JamBoard to document behaviors, questions, and moments of friction.
Cross-Functional Collaboration
Marketing shared market trends and customer surveys. Sales & User Success provided qualitative insights from conversations. Product contributed behavioral data and friction points.

Facts and Assumptions Workshop
17
We needed to identify assumptions before designing.
We split up into three groups: Learning, Trying, Sending.

Alignment: Assumption Grid
18
Align on the critical assumptions and most valuable ideas, then validate them.
Low Confidence / High Importance = Critical assumptions
High Confidence / Low Importance = Likely a fact
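To make the grid concrete, here is a minimal sketch of how statements from a workshop could be sorted into the quadrants described above. The example statements, scores, and 0.5 thresholds are hypothetical, not taken from the actual workshop.

from dataclasses import dataclass

@dataclass
class Statement:
    text: str
    confidence: float   # 0 = pure guess ... 1 = well evidenced
    importance: float   # 0 = trivial ... 1 = decision-critical

def quadrant(s: Statement) -> str:
    # Low confidence + high importance = critical assumption to validate first.
    if s.confidence < 0.5 and s.importance >= 0.5:
        return "critical assumption - validate"
    # High-confidence items behave more like facts.
    if s.confidence >= 0.5:
        return "likely a fact"
    return "low-priority assumption"

statements = [
    Statement("Users understand why we ask for their industry", 0.3, 0.9),
    Statement("Most prospects arrive via the marketing website", 0.8, 0.7),
]

for s in statements:
    print(f"{s.text}: {quadrant(s)}")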

Facts and Assumptions - Impact
19
The Facts and Assumptions workshop brought alignment and helped identify hypotheses and questions to dig deeper.
Questions that surfaced:
●What information does a customer need from us?
●What information do we need from customers?
●When is the information needed?
●What information are users willing to give us?
●When/Where are critical moments?
●What aspects of the journey impact decision making?
●Are there any differences based on experience level?

Our Fieldwork Begins:
What are competitors doing right (and wrong)?

Research Methodology: Answering the questions
21

Research Timeline: Sept 2021-Aug 2022
22

Step 2: Heuristic Evaluation

Step 2
Heuristic Review
24
Explored our and competitors’ flows to build foundational knowledge about onboarding experiences.
●Each researcher documented 4 onboarding flows (Constant Contact + 3 competitors).
●Focused on the experience of our target audience.
●Nielsen’s heuristics and severity ratings.
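As a rough illustration of the severity-rating approach mentioned in the last bullet, the sketch below logs heuristic findings against Nielsen’s 0-4 severity scale. The flow name and the example finding are hypothetical, not actual results from the review.

from dataclasses import dataclass

# Nielsen/NN-g severity scale: 0 = not a problem ... 4 = usability catastrophe.
SEVERITY_LABELS = {
    0: "not a usability problem",
    1: "cosmetic problem only",
    2: "minor usability problem",
    3: "major usability problem",
    4: "usability catastrophe",
}

@dataclass
class Finding:
    flow: str         # which onboarding flow was reviewed (hypothetical name)
    heuristic: str    # one of Nielsen's 10 usability heuristics
    observation: str  # what the reviewer saw
    severity: int     # 0-4 rating on the scale above

findings = [
    Finding(
        flow="Competitor sign-up",
        heuristic="Visibility of system status",
        observation="No progress indicator during the multi-step sign-up",
        severity=3,
    ),
]

for f in findings:
    print(f"[{f.severity}: {SEVERITY_LABELS[f.severity]}] {f.flow} - {f.observation}")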

Heuristic Review - Example of Severity Scale Ratings
25
Heuristic: Visibility of System Status
Finding: Display # of steps and/or progress indicator/breadcrumbs
●Constant Contact shows step x of 6
●Competitor 1 progressively reveals steps and highlights the current one
●Competitor 2 has 2 steps and highlights the current one
●Competitor 3 has breadcrumbs showing 6 steps with names not numbers
●Competitor 4 has an indistinct progress indicator

Heuristic Review Artifacts
26

Insights from Heuristic Review
27
Landscape of Competitors: Mapped how competitors guide users through onboarding.
Actionable Insights: Shared quick wins with stakeholders.
Assumptions to Prompt About: Informed what to test with users.
Hidden UX Pitfalls: Exposed issues we might miss in unmoderated testing.

Step 3: Competitive Evaluation

Step 3
Competitive Analysis
29
Ran 35 unmoderated sessions with our target audience, using 7 live getting-started flows (Constant Contact + 6 competitors).
●Learn which onboarding flows engage users.
●Learn how much personalization is offered.
●Test study plan by running a leaner version.

Competitive Testing - Examples
30
Appreciated visuals, especially when there is a lot of text.
Want transparency and to understand plans and pricing.

Competitive Audit of Information Required During Account Creation
31

Insights from Competitive Testing
32
User Language: Learned the language our audience uses.
Comfort Threshold: Identified which information users resist sharing early.
Actionable Recs: Shared findings with stakeholders to drive quick wins.
Primary Competitor: Chose the strongest competitor for deeper benchmarking.

Step 4: The Big Bet
A 100-User Longitudinal Benchmark Study

Step 4
Longitudinal Benchmark Study
34
Built to be Repeatable: Designed a test plan that could be rerun (on the same or a smaller scale) as our SaaS tool evolves, and as a baseline.
Captures a Changing Landscape: Track how onboarding flows shift over time and across competitors.
Richer Than Quantitative Only: Combined quantitative (ratings, clicks) and qualitative (quotes, behaviors) for full context.

Study Overview
35
Participants & Timeline: 100 participants in session #1 - Week 1. Reinvited them back for session #2 - Week 2.
Format: Each participant completed two 30-40 minute sessions, unmoderated & blind.

Learn / Awareness - Marketing Website Evaluation
36
Explore both marketing websites and evaluate whether they are right for your
business (counterbalanced)
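For readers unfamiliar with counterbalancing, the sketch below shows one simple way to alternate which website participants see first so order effects balance out. The participant IDs and site labels are placeholders, not the study’s actual assignment tooling.

import random

# Hypothetical participant IDs and site labels (not the study's actual setup).
participants = [f"P{i:03d}" for i in range(1, 101)]
sites = ("Constant Contact", "Competitor")

random.seed(7)               # reproducible assignment
random.shuffle(participants)

order = {}
for i, pid in enumerate(participants):
    # Alternate which site is shown first across participants.
    order[pid] = sites if i % 2 == 0 else sites[::-1]

print(order[participants[0]], order[participants[1]])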

Sign-up Process / Try
37
How many will pick our tool? Why?
Let’s look closer…

●Did they complete all steps?
●Did they understand why the
information was being asked?
●Were they comfortable providing the
information?
●What were the critical moments?
●What are their expectations after
signing up?
●Did experience level matter?

Commit/Use - Core Tasks
38
●Got a 100% response rate.
●Used it since the last session?
●Explored and completed core tasks.

Analysis
39
Each researcher reviewed half the studies and coded responses and interactions for richer insights.
●Captured post-task Likert ratings to layer quantitative sentiment onto qualitative themes.
●Analyzed task times to surface hidden behavior patterns.
●Ran correlations to connect findings with age, role, experience level, and tenure.
●Applied independent t-tests and Chi-Squared tests to highlight significant differences.
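A minimal sketch of the kinds of checks listed above (an independent-samples t-test and a chi-squared test), using SciPy with made-up numbers rather than the study’s actual data:

import numpy as np
from scipy import stats

# Hypothetical task times in seconds for two experience levels.
novice_times = np.array([182, 240, 205, 310, 275, 198, 260])
expert_times = np.array([150, 170, 195, 160, 210, 185, 175])

# Independent-samples t-test: do task times differ by experience level?
t_stat, t_p = stats.ttest_ind(novice_times, expert_times, equal_var=False)
print(f"t = {t_stat:.2f}, p = {t_p:.3f}")

# Chi-squared test: is task completion associated with experience level?
# Rows = novice/expert, columns = completed / did not complete (made-up counts).
contingency = np.array([[38, 12],
                        [45, 5]])
chi2, chi_p, dof, expected = stats.chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {chi_p:.3f}")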

40
Example - Experience 1 Key Metrics

Analysis
41
UserTesting tools helped
analyze large amounts of
interaction data
Countless charts
helped identify trends
and patterns

Example: Website Behavior - Time Spent and Clicks
42
●All participants explored both homepages
●Most interacted with navigation menus
[Chart: time spent and clicks, by competitor]
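As an illustration of how time-spent and click metrics like these could be aggregated, here is a small pandas sketch. The column names and values are hypothetical, not the UserTesting export format or the study’s real numbers.

import pandas as pd

# Made-up per-participant behavior log; column names are hypothetical.
events = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3],
    "site": ["ours", "competitor", "ours", "competitor", "ours", "competitor"],
    "seconds_on_page": [312, 254, 198, 301, 275, 220],
    "clicks": [14, 9, 7, 12, 11, 8],
})

# Average and median time spent and clicks per site.
summary = events.groupby("site")[["seconds_on_page", "clicks"]].agg(["mean", "median"])
print(summary)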

Insights from Benchmark Study
43
UXR Influence: We became the go-to team for onboarding insights and led a facts-driven Journey Mapping Workshop.
Organizational Wins: Broke silos and built cross-functional muscle.
Product Changes: Faster, more confident decisions, clearer messaging, and stakeholder alignment.

Journey Maps
44

What We Learned
45
Gained deep insights into how people evaluate tools and what helps them
feel confident in their decision.
Discovered key differences between marketing websites and in-product
experiences—especially around clarity, navigation, and trust signals.
Identified critical friction points in onboarding flows that were invisible in
typical unmoderated testing.
Surfaced gaps in content, confusing UI moments, and missing feedback
cues that led to drop-off or hesitation.
Learned from competitors.

Tracking Actionable Insights & Recommendations
46
●Our research got a lot of visibility (in a good way)!
●Small scale/scrappy repositories can work!

Our Impact
47
Unified tone and messaging across marketing and product to remove
disjointedness.
Redesigned the sign-up flow to reduce steps, clarify value, and improve
handoff between Marketing website and Product.
Identified and addressed performance bugs that blocked completion.
Prioritized usability improvements by severity, frequency, and business
impact—grounded in real behavior, not assumption.

Retrospective
48
Take the opportunity when you can; plan scrappy, scale strategically.
Manual qualitative coding is time-consuming; AI accelerates.
Be aware of site changes and A/B tests.
Consider the trade-off between watching every session vs. managing time
and cost.
Longer unmoderated sessions worked surprisingly well, delivering rich quantitative and qualitative insights.

Planning & Execution
Recommendations for Future Studies
49
●Consider running this iteratively on a smaller scale.
○Fewer sessions in batches.
●Decide how often to repeat the study.
●Lean into quantitative questions.
○Rating scales, multiple choice.
●Create an issue tracking sheet.
●Take advantage of AI tools and capabilities; they improve every day.

Recommendations for Future Studies
50
Design Research
●Include heuristic and competitive audits.
●Consider non-competitor tools in your benchmarking.
Stakeholder Engagement
●Involve stakeholders early and often.
●Tailor outputs to your audience.

Findings and Recommendations Spreadsheet
51

Retrospective Framework
52
What went well?
What went poorly?
What ideas do you have?
How should we take action?

What is one risky assumption
you’ve never validated?

What small scrappy project can
you start next week?

Questions?

What’s one thing
you’re taking away?

Feel Free to Reach Out
55
Linda Borghesani
UX Research Manager
Lecturer, Tufts University
Email: [email protected]
LinkedIn: linkedin.com/in/lindaborghesani



Jaitra Dixit
UX Researcher

Email: [email protected]
LinkedIn: linkedin.com/in/jaitradixit

Credits & Resources

Credits
●Most of the graphics/icons used in this presentation are taken from
Flaticon.com
●Some of the graphics/images were generated by AI
57

Resources
●https://www.nngroup.com/articles/pure-method/
●https://www.nngroup.com/articles/ten-usability-heuristics/
●https://www.nngroup.com/articles/how-to-rate-the-severity-of-usability-problems/
58