AI Overviews Explained: Expert Embedding Techniques For SEO Success

SearchEngineJournal 1,641 views 33 slides Aug 28, 2024

About This Presentation

AI Overviews are here, and they’re making a big impact in the world of SEO. Are you up to speed on how to maximize their impact?

Watch as we dive into the fascinating world of Google AI Overviews and their functionality, exploring the concept of embeddings and demystifying the complex processes b...


Slide Content

Your Host: Scott Stouffer (@scott_stouffer, in/scottstouffer/)
Carnegie Institute of Technology at Carnegie Mellon University.
B.S. and M.S. in Computer and Electrical Engineering.
CTO + Co-Founder of Market Brew in 2006.
Google for Entrepreneurs Advisor.
Inventor and author of multiple utility patents in both the software and search space.
See more of the story @ ask.thesearchengineer.com

Agenda
Introduction to AI Overviews
What are Embeddings?
Introduction to BERT and Sentence-BERT
How Does Cosine Similarity Work?
The FREE AI Overviews Visualizer
Using Search Engine Models to Discover Which Structures Google Prefers

AI Overviews Explained: Expert Embedding Techniques For SEO Success

Sponsored by Market Brew
Build your own calibrated search engine models for any search engine, and understand algorithm updates immediately.
Then use Market Brew's genetic algorithms to discover statistical gaps between your landing pages and the outperformers.
The most advanced SEO software platform in the industry will help you efficiently implement the biggest levers available by auto-generating and classifying tasks.
When you are done, you'll be able to forecast your rankings on that search engine, using the model as a proxy.
Learn more @ marketbrew.ai

How AI Overviews Work
AI Overviews summarize content by understanding its context and meaning.

It is a Large Language Model (LLM) that relies on "enhanced context".
[Diagram: (1) a query arrives; (2) relevant information is searched from knowledge sources; (3) that relevant information becomes the enhanced context; (4) prompt + query + enhanced context are sent to the Large Language Model endpoint; (5) a generated text response is returned.]

The context is guided by the retrieval of highly relevant snippets of content.
[Diagram: 1. Retrieve: the query's embedding is matched against a vector database to pull relevant context; 2. Augment: the prompt is assembled from the query plus that context; 3. Generate: the LLM produces the response.]
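The retrieve / augment / generate loop above can be sketched in a few lines. This is a hypothetical stand-in: retrieval here scores snippets by word overlap instead of real embedding similarity, and the LLM call is represented only by the assembled prompt.

```python
def embed(text: str) -> set:
    # Toy stand-in for an embedding: a bag of lowercase words.
    return set(text.lower().split())

def retrieve(query: str, snippets: list, k: int = 2) -> list:
    # 1. Retrieve: rank snippets by word overlap with the query
    # (a real system would use cosine similarity over dense vectors).
    return sorted(snippets,
                  key=lambda s: len(embed(query) & embed(s)),
                  reverse=True)[:k]

def augment(query: str, context: list) -> str:
    # 2. Augment: build the enhanced-context prompt for the LLM.
    joined = "\n".join(context)
    return f"Context:\n{joined}\n\nAnswer the query: {query}"

snippets = [
    "AI Overviews summarize content by understanding its context.",
    "Cosine similarity compares the angle between two vectors.",
    "Bread is best baked at high temperatures.",
]
query = "how do ai overviews summarize content"
prompt = augment(query, retrieve(query, snippets))
# 3. Generate: `prompt` would now be sent to the LLM endpoint.
print(prompt)
```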

Embeddings are a modern choice for context retrieval.

What Are Embeddings?
Embeddings convert text into numerical form for AI models, enabling machines to process and understand human language.
[Diagram: an embedding model maps a set of objects to vectors, e.g. Object 1 → 0.6 0.3 0.1 ..., Object 2 → 0.8 0.5 0.3 ..., Object 3 → 0.4 0.2 0.9 ...]
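As an illustration, a minimal "embedding model" can be faked with a fixed vocabulary and count vectors. The vocabulary and dimensions here are invented for the example; real embedding models (such as BERT-based ones) learn dense vectors from data rather than counting words.

```python
import math

# Invented toy vocabulary: one vector dimension per word.
VOCAB = ["seo", "embeddings", "search", "content", "query", "ranking"]

def embed(text: str) -> list:
    # Count each vocabulary word, then L2-normalize: text -> unit vector.
    counts = [text.lower().split().count(w) for w in VOCAB]
    norm = math.sqrt(sum(c * c for c in counts)) or 1.0
    return [c / norm for c in counts]

vec = embed("embeddings turn content into vectors for search")
print(vec)  # one number per vocabulary word, unit length
```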

Embeddings are vector representations of words, phrases, or sentences, capturing semantic meaning and relationships in a high-dimensional space.
[Diagram: a cloud of machine-learning terms (Decision, Loss, Recall, Bias, Prior, Cluster, Feature, Neuron, Dataset, Correlation, Factor, Class, Network, Transfer, Variance) positioned by semantic relatedness.]

Poll Question #1: Investment in SEO Tools
What is your annual budget for SEO tools and platforms?

What Are Embeddings?
Embeddings represent complex relationships between words in high-dimensional space, useful for various NLP tasks. Google's BERT was one of the pioneers of this kind of system.
[Diagram: analogy pairs. Male-Female: King→Queen, Man→Woman; Country-Capital: Germany→Berlin, Canada→Ottawa, Spain→Madrid, Ireland→Dublin, China→Beijing; Verb-Tense: Swim→Swam, Jump→Jumped]
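The analogy structure (king − man + woman ≈ queen) can be checked with hand-built toy vectors. The two axes here ("royalty" and "gender") are invented for the illustration; real embeddings learn hundreds of dimensions from data.

```python
import math

# Hypothetical 2-d embeddings: axis 0 = "royalty", axis 1 = "gender".
vectors = {
    "king":  (1.0,  1.0),
    "queen": (1.0, -1.0),
    "man":   (0.0,  1.0),
    "woman": (0.0, -1.0),
}

# king - man + woman lands exactly on queen's coordinates here.
target = tuple(k - m + w for k, m, w in
               zip(vectors["king"], vectors["man"], vectors["woman"]))

nearest = min(vectors, key=lambda word: math.dist(vectors[word], target))
print(nearest)  # -> queen
```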

Introduction to Sentence-BERT
Sentence-BERT is a modification of BERT for generating sentence embeddings.
[Diagram: a Siamese network. Sentence A ("My father plays with me at the park") and Sentence B ("I play with my dad at the park") each pass through BERT with the same parameters and weights; mean pooling turns each output into a sentence embedding; the cosine similarity of Sentence Embedding A and B is trained against a target cosine similarity with an MSE loss.]
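The mean-pooling step in the diagram is simple to sketch: average the token vectors that BERT outputs, ignoring padding. The token vectors below are made-up numbers standing in for BERT's last hidden layer.

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    # Average only the real (non-padding) token vectors, as
    # Sentence-BERT's pooling layer does to form one sentence embedding.
    mask = attention_mask[:, None]                  # shape (tokens, 1)
    return (token_embeddings * mask).sum(axis=0) / mask.sum()

tokens = np.array([[0.2, 0.8],    # "my"
                   [0.4, 0.6],    # "dad"
                   [0.6, 0.4],    # "plays"
                   [0.0, 0.0]])   # [PAD], excluded by the mask
mask = np.array([1, 1, 1, 0])
sentence_embedding = mean_pool(tokens, mask)
print(sentence_embedding)  # -> [0.4 0.6]
```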

It enables efficient comparison of longer text segments, enhancing tasks like semantic search and clustering.

How Does Cosine Similarity Work?
Cosine similarity measures the cosine of the angle between two vectors, determining their similarity from -1 to 1.
Angle close to 0°: cos(θ) close to 1 (similar vectors).
Angle close to 90°: cos(θ) close to 0 (orthogonal vectors).
Angle close to 180°: cos(θ) close to -1 (opposite vectors).
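The three cases above follow directly from the formula cos(θ) = (A·B) / (‖A‖‖B‖). A minimal implementation:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

print(cosine_similarity([1, 0], [2, 0]))    # same direction -> 1.0
print(cosine_similarity([1, 0], [0, 3]))    # orthogonal     -> 0.0
print(cosine_similarity([1, 0], [-1, 0]))   # opposite       -> -1.0
```

Note that magnitude drops out: [1, 0] and [2, 0] still score 1.0, which is why cosine similarity compares meaning rather than length.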

Cosine similarity is used to compare the relevance of text to queries, crucial for effective information retrieval and ranking.
[Screenshot: Similarity: 69.38%; Page Level Cluster: #2; Top Cluster Score: 61.49%; Site Level Cluster: Add to Ranking Blueprint; Top Cluster (Site) Score: Add to Ranking Blueprint]

The AI Overviews Visualizer
Shows how a page's content is broken into chunks, each of which gets converted into embeddings. Query each embedding to get its Cosine Similarity and more.
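A sketch of what the visualizer computes per chunk, using pre-computed (invented) chunk and query embeddings in place of a real embedding model:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical embeddings for each content chunk and for the query.
chunks = {
    "intro paragraph": [0.9, 0.1, 0.2],
    "pricing table":   [0.1, 0.8, 0.3],
    "how-it-works":    [0.8, 0.2, 0.4],
}
query_vec = [0.9, 0.1, 0.3]

# Score every chunk against the query, highest first.
scores = {name: cosine(vec, query_vec) for name, vec in chunks.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2%}")
```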

By comparing against pages known to be biased for that query, you can optimize content structure and relevance for AI Overviews, Snippets, and more.

What Should We Copy?
Find which chunks of content Google biases for a given query, and use that as a reference implementation to understand how to re-align strategy.

Poll #2: A/B Testing
How important is the ability to conduct A/B testing and discover optimization opportunities in your SEO strategy?

Test changes using the "Test New Content" feature.

Practical Example Using the Visualizer
Instantly see if your changes cause a higher or lower score.

History of Search Engine Modeling
Search engine modeling is when a model of a search engine uses dynamic weights for each algorithm. Some advanced models calibrate to solve for the weights, given a ranking result.
[Diagram: rankings shifting after a Google algorithm update]

To add new algorithms, search engine models use "first principles" calculations as inputs. These KPIs can be used as inputs in many useful algorithms.
Embedding similarities for "cosine similarity": Max: 74.88%; Avg: 66.81%; Top Cluster: 80.68%; Top Cluster (Site): 65.30%

For search engine models that calibrate themselves, the new algorithms are included during the calibration process to see if they help the accuracy of the search engine model.
[Screenshot: current algorithmic settings and calibrated boost factors]

Adding New Algorithms
Using search engine models that calibrate themselves, new algorithms are introduced during the calibration process. They are then measured to see if they help or hurt the accuracy of the search engine model.
A number of derivative algorithms come from these visualizations:
Maximum similarity: What is the highest cosine similarity of any chunk?
Average similarity: What is the average cosine similarity of all chunks?
Heading similarity: Using cosine similarity for each heading tag, derive a score.
META Title similarity: What is the cosine similarity for the META Title?
Top Cluster similarity (Parasite SEO): Does this chunk belong to the top cluster on the page/site?
DOM Cluster similarity (Preferred CMS/Template): Associate specific DOM structures with rankings.
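Given per-chunk similarity scores, the first two derivative metrics fall out in one line each. The scores below are invented for the example:

```python
# Hypothetical cosine-similarity scores, one per content chunk.
chunk_scores = [0.7488, 0.6210, 0.6681, 0.5902, 0.7103]
meta_title_score = 0.7012  # invented score for the META Title itself

maximum_similarity = max(chunk_scores)               # best chunk
average_similarity = sum(chunk_scores) / len(chunk_scores)

print(f"Max: {maximum_similarity:.2%}")
print(f"Avg: {average_similarity:.2%}")
print(f"META Title: {meta_title_score:.2%}")
```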

Example: Top Cluster Similarity
First, the embedding clusters are determined for the entire subdomain / page.

For each embedding, calculate its page / site cluster. A lower cluster # means this embedding chunk is not part of the top cluster: the page's / site's top cluster is NOT about this content chunk.
[Screenshot: Similarity: 71.87%; Page Level Cluster: #1; Top Cluster Score: 78.36%; Site Level Cluster: #3; Top Cluster (Site) Score: 48.08%]
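The page-level clustering step can be sketched with a tiny k-means over chunk embeddings. The 2-d vectors and the deterministic initialization are invented for the illustration; the actual clustering method the platform uses isn't specified here.

```python
import numpy as np

def kmeans(X, k, iters=20):
    # Minimal k-means: deterministic init on the first k points.
    centroids = X[:k].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each point to its nearest centroid, then recenter.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels

# Hypothetical 2-d chunk embeddings: three about topic A, two about topic B.
chunks = np.array([[0.9, 0.1], [0.8, 0.2], [0.85, 0.15],
                   [0.1, 0.9], [0.2, 0.8]])
labels = kmeans(chunks, k=2)

top_cluster = np.bincount(labels).argmax()   # the cluster with most chunks
in_top = labels == top_cluster               # does each chunk belong to it?
print(labels, top_cluster, in_top)
```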

For each algorithm, ask the search engine model:
1. Are this algorithm's rankings also correlated with the overall rankings?
2. If so, is our score NOT already in the top tier for this algorithm (the tier of pages that Google biases for this algorithm)?
If both of these are true, show the algorithm output for both the target page and a page that Google biases.

Poll #3: Search Engine Models
How likely are you to consider using search engine models to discover which algorithms Google biases, and the pages that it biases for those algorithms?

The AI Overviews Visualizer is FREE
No account needed. AI Overviews, embeddings, and visualizations of modern SEO.
Use the FREE version now: https://brew.marketbrew.ai/ai-overviews-visualizer.htm

Be an SEO Leader
Join the next generation of data-driven SEO. Already very popular with top SEO teams around the world.

AI Overviews Explained: Expert Embedding Techniques for SEO Success
Q&A: AI Overviews, embeddings, cosine similarity, and search engine models.
Use the FREE version now: https://brew.marketbrew.ai/ai-overviews-visualizer.htm