DSC DACH 25 - Nina Mrzelj & Sumeyye Suri.pdf

DataScienceConferenc1 · 27 slides · Oct 24, 2025

About This Presentation

"Everyone is talking about using AI, and FOMO creeps in: you want to use it too. But… for what, exactly?

As the hype and interest around AI grow, so does the number of AI projects that seem to exist just because they involve AI. In this session, we will skip the buzzwords and give you practi...


Slide Content

© 2025, Sclable Business Solutions GmbH
Hello!
15 October 2025


How to Choose the Right AI Use Cases and Survive the Wrong Ones

Dr. Sümeyye Suri
Data Scientist
@ sclable

Nina Mrzelj
Data Science Lead
@ sclable

The AI reality gap
Ambition ≠ Impact
98% of leaders report increased urgency to deliver on AI.
5% of AI use cases make it into production and deliver value.

The majority of AI failures are not technical — they’re strategic or organisational.

What AI really does (and doesn’t) do
What it does: finding patterns and making predictions.
What it doesn’t: act as a silver bullet for unclear problems.

Does this problem need AI?
What is the business problem?
Can it be solved with simpler tools?
How will we measure success?
Do we have the right data and people?

Evaluate value
Business: How large is the estimated business value? Additional revenue, reduced costs, reduced time, …
Strategy: How large is the contribution to strategic goals? Reduced risk, sales growth, new clients, sustainability, …
Customer / User: How big is the influence on customer/employee engagement? How important is it in connection with user satisfaction?

Evaluate effort
Data: Is the needed data defined and available? Is it accessible in the right quality?
People: Can we get the stakeholders on board with management support? Do we have the right people & skills to tackle the use case?
Technology: Are the required tech resources known and available? Is the technology well-known, or is there a substantial risk?

Value vs Effort matrix
[Matrix: value on one axis, effort on the other, each running from - to +. We want to be in the high-value, low-effort corner :)]
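The value and effort dimensions above can be turned into a simple prioritisation score. This is a minimal sketch, not from the slides: the 1–5 rating scale, the dimension names, and the quadrant threshold are all illustrative assumptions.

```python
# Illustrative sketch: score a use case on the value and effort
# dimensions, then place it in the value-vs-effort matrix.
from statistics import mean

def score(ratings: dict[str, float]) -> float:
    """Average the 1-5 ratings across a dimension's questions."""
    return mean(ratings.values())

def quadrant(value: float, effort: float, threshold: float = 3.0) -> str:
    """Place a use case in the value-vs-effort matrix (scores on a 1-5 scale)."""
    if value >= threshold and effort < threshold:
        return "quick win"        # high value, low effort: where we want to be
    if value >= threshold:
        return "strategic bet"    # high value, high effort
    if effort < threshold:
        return "fill-in"          # low value, low effort
    return "avoid"                # low value, high effort

# Hypothetical use case rated on the questions from the two slides above
use_case = {
    "value":  {"business": 4, "strategy": 4, "customer": 5},
    "effort": {"data": 2, "people": 2, "technology": 3},
}
v = score(use_case["value"])    # ~4.3
e = score(use_case["effort"])   # ~2.3
print(quadrant(v, e))           # -> quick win
```

Averaging is the simplest aggregation; in practice teams often weight the questions differently per organisation.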

AI projects live in a messy,
unpredictable world.

Fail often, fail fast
Recover smart

What to keep in mind?
AI projects are iterative, not linear.
Data quality drives feasibility.
Stakeholder alignment has to happen often.
Choose experimentation over one expensive model.

AI Project Kickoff
● Scope defined vaguely
● Success metrics unclear
● Assumptions:
○ Data is ready
○ More data can be found
○ Proxy data is good enough

AI Project End: the cost of failing slowly
● Team is stressed and burnt out
● The data was NOT ready
● The data did not suit the task
● Models did not deliver miracles
● Outcome is not actionable

What went wrong
● Constant data challenges surfaced
● Assumptions invalidated too late
● Model built in isolation w/o feedback
● Technical debt accumulated
● Decisions are reactive, not proactive


We need to set projects up in a way that allows for multiple decision gates.

Decision gate
Team receives data sample
1-week data validation effort
Assumptions (in)validated early
Data sufficiency validated
Opportunity to kill/pivot and rescope
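The one-week data-validation gate above can be made concrete with a few cheap sufficiency checks that either pass the gate or surface grounds to kill, pivot, or rescope. This is a minimal sketch under illustrative assumptions: the field names, row counts, and thresholds are hypothetical, not from the slides.

```python
# Illustrative decision-gate sketch: run cheap checks on a data sample
# and report whether the project should proceed past the gate.
def validate_sample(rows: list[dict], required_fields: list[str],
                    label_field: str, min_rows: int = 500,
                    max_missing_ratio: float = 0.2) -> tuple[bool, list[str]]:
    """Return (gate_passed, findings) for a data sample."""
    findings = []
    # Sufficiency: is there enough data at all?
    if len(rows) < min_rows:
        findings.append(f"only {len(rows)} rows, need at least {min_rows}")
    # Quality: are the required fields actually filled in?
    for field in required_fields:
        missing = sum(1 for r in rows if r.get(field) in (None, ""))
        if rows and missing / len(rows) > max_missing_ratio:
            findings.append(f"{field}: {missing / len(rows):.0%} missing")
    # Suitability: does the target carry any signal to learn from?
    labels = {r.get(label_field) for r in rows} - {None, ""}
    if len(labels) < 2:
        findings.append(f"label '{label_field}' has {len(labels)} distinct value(s)")
    return (not findings, findings)

# Toy sample: fails the gate (too few rows, only one label) -> pivot or rescope
passed, findings = validate_sample(
    rows=[{"text": "ok", "label": "a"}] * 10,
    required_fields=["text"], label_field="label")
```

The point is not the specific checks but that they are automated, cheap, and run before any modelling effort is committed.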

24
Team receives data sample: 1-week data validation effort
Business understanding → Data understanding → Data preparation → Model development → Evaluation → Deployment → Monitoring
Decision gates sit between the stages.


How to fail fast and recover smart

Agile AI Development
● short sprints
● fast feedback
● learn, document, move on

Versioning
● version data
● version code
● version experiments

Pivot/Kill
● early decisions
● multiple decision gates

Fail often
● small experiments in parallel
● analyze what’s/why’s

Lean AI
● test project feasibility
● low-cost, low-fidelity models before scaling up
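The "version data, version code, version experiments" practice can be sketched in a few lines: fingerprint each experiment with hashes of its data and configuration so any result can be traced back and reproduced. This is an illustrative sketch, not the speakers' tooling; in practice dedicated tools (e.g. DVC or MLflow) cover this ground.

```python
# Illustrative sketch: derive a stable experiment ID from the exact
# data and config contents, so runs stay comparable and reproducible.
import hashlib
import json

def fingerprint(data: bytes, config: dict) -> str:
    """Hash the data and a canonical form of the config into a short run ID."""
    h = hashlib.sha256()
    h.update(data)
    h.update(json.dumps(config, sort_keys=True).encode())
    return h.hexdigest()[:12]

data = b"raw training sample v1"
config = {"model": "baseline", "lr": 0.01}
run_id = fingerprint(data, config)
# Same data + config -> same ID; any change in either yields a new ID,
# so small parallel experiments stay distinguishable and auditable.
assert run_id == fingerprint(data, config)
assert run_id != fingerprint(data, {"model": "baseline", "lr": 0.02})
```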

Takeaways
Don’t start with AI — start with the problem.
Evaluate opportunities with a structured approach.
Align AI projects with measurable business outcomes.

Validate your assumptions early on.
Choose experimentation over high-fidelity prototypes.
Document, communicate, rinse and repeat!