GreenSEO October 2024 BrightonSEO fringe event slides.
About This Presentation
The slides from the GreenSEO October 2024 event.
Slide Content
#greenSEO
@creativebloomUK
About GreenSEO
The GreenSEO manifesto
1. We will do all within our power and influence to make the web a greener place.
2. Each month, we each pledge to take one action towards this goal.
3. We share, open source, support and help advise each other.
4. We talk to people about the impact of the web on the environment and invite like-minded souls to join the greenSEO movement.
The GreenSEO website
https://greenseo.org/
Apply to speak at greenSEO
Impact
THE INTERNET = 4% OF GLOBAL GHGs*
*Frédéric Bordage, 2019
Energy use
Data centers!!!
DIGITAL are the bad guys!!!
Wikipedia go dark mode
Google is taking bold action
Heidi plants some seeds…
Our asks of you
Plant some seeds
Do one thing a month
The GreenSEO website: https://greenseo.org/
Bookmark your everyday pages
1 search = half a kettle boiling*
*Google
Never be afraid to just ask
All we have had is “yes” so far
Join us
● Case study: Rowan Chernin: Adventures in dark mode - introducing sustainability into user experience.
● Software: Aaron James, Screaming Frog: intro to Screaming Frog's carbon emissions measurement feature.
● Practical: Ben Meyer, BM Digital: how to do a carbon audit on your website.
Adventures in dark mode - introducing sustainability into user experience
Rowan Chernin
LinkedIn | Rowan Chernin
Can we have a greener web experience?
…and deliver our marketing message on a platform that uses less energy?
DARK MODE
Selling the value to a corporate
Where do we start?
How do we measure a greener web experience?
What is the value of dark mode to the business and customers?
Internal buy-in
Value to the business
- Reduce energy use: the internet uses 830 million tons of carbon a year; if the internet were a country, it would rank 7th in electricity usage.
- SEO value: speed is a Google ranking factor, and energy usage may be a ranking factor in the future.
- Customer experience: a faster user experience, less time wasted navigating a page, and less energy used in dark mode.
Faster site speed = less energy
Unlike LED screens, which have a backlight permanently on, OLED screens only use power when activated to illuminate the light areas of the webpage.
Corporate gamification
Beacon measures the carbon used for each new visitor:
Win over stakeholders
An internal poll is easy to generate, with quick results:
- In favour of dark mode? Yes: 89%
- Is it a good way to reduce energy consumption overall? Yes: 100%
- Is the page harder to read? No: 67%
Stakeholders wanted to switch between light and dark mode
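For illustration, a minimal sketch of how such a light/dark toggle can work in plain JavaScript; this is not the team's actual implementation, and the "dark" class name and "theme" storage key are hypothetical and would need to match your CSS:

```javascript
// Minimal toggle sketch: honour the OS preference unless the visitor has
// chosen a theme. The "dark" class and "theme" key are hypothetical.
const prefersDark = window.matchMedia('(prefers-color-scheme: dark)').matches;
const saved = localStorage.getItem('theme');
document.documentElement.classList.toggle('dark', saved ? saved === 'dark' : prefersDark);

// Wire this up to the toggle button's click handler.
function toggleTheme() {
  const isDark = document.documentElement.classList.toggle('dark');
  localStorage.setItem('theme', isDark ? 'dark' : 'light');
}
```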
Results:
- Page went live with a toggle
- Localised into different languages
- Dark mode on the roadmap
Useful links:
- What is dark mode
- Why dark mode web designs are gaining popularity
- The difference between OLED and LED devices
- Dark mode research data
- Google on dark mode
Intro & demo of Screaming Frog's carbon emissions measurement feature
Aaron James, SEO and Data Manager, Screaming Frog
LinkedIn | Aaron James
Who am I?
Aaron James, SEO & Data Manager at Screaming Frog. I'm involved in SEO strategy, tech SEO, Spider support & training.
Screaming Frog
We create industry software:
- SEO Spider
- Log File Analyser
Surprisingly, we're also an SEO (PPC/PR) agency!
Why Consider a Greener SEO Approach?
- 4.1 billion people use the internet (53.6% of the global population).
- Gadgets, the internet and the systems supporting them account for 3.7% of global greenhouse emissions, according to estimates in 2020.
- 3.7% is comparable to the global airline industry.
- This is predicted to double by 2025.
- By 2040, projections estimate that greenhouse emissions from the IT industry will reach 14%.
- The global warming threshold target aims to reduce emissions by 43% by 2030.
So, every little helps.
Source: BBC climate change article, 2020: https://www.bbc.com/future/article/20200305-why-your-internet-habits-are-not-as-clean-as-you-think
How Can Sites Help to Reduce Emissions?
- Evaluate your site to understand carbon ratings and CO2 emission estimates.
- Work with SEOs and web developers to create more efficient, lower-carbon websites.
- Consider using green website hosting.
How Can the SEO Spider Help?
The CO2 (mg) metric uses the CO2.js library, provided by the Green Web Foundation, to calculate CO2 emissions. CO2.js estimates the carbon emitted by moving data over the internet, based on the data transfer size.
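To illustrate what the library is doing, here is a minimal sketch of using CO2.js directly; the byte count and model choice are example assumptions, not the SEO Spider's internal configuration:

```javascript
// Sketch: estimating emissions with CO2.js (npm: @tgwf/co2).
// The byte count is an arbitrary example.
import { co2 } from '@tgwf/co2';

const estimator = new co2({ model: 'swd' }); // Sustainable Web Design model
const pageBytes = 2400000;                   // ~2.4 MB transferred
const greenHost = false;                     // true if hosting is verified green

console.log(`${estimator.perByte(pageBytes, greenHost)} g CO2 per page load`);
```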
Estimate CO2 Emissions by URL
https://www.screamingfrog.co.uk/seo-spider/issues/validation/high-carbon-rating/
The Carbon Rating metric is calculated using the Sustainable Web Design Model combined with the CO2 (mg) metric.
Carbon Rating Benchmark by URL
The Carbon Rating metric is a benchmark to help the everyday user understand the CO2 (mg) metric; anything larger than 2.4 MB is a failure.
Carbon Rating Benchmarks
https://sustainablewebdesign.org/digital-carbon-ratings/
Viewing the Data
Open the Validation tab to see the data by URL, then select the 'High Carbon Rating' filter from the dropdown menu.
Setting Up the SEO Spider
Basic Setup – JavaScript Rendering
JavaScript rendering must be enabled to view the Carbon Rating and to use the 'High Carbon Rating' filter. Otherwise, it won't be calculated!
https://www.screamingfrog.co.uk/seo-spider/tutorials/crawl-javascript-seo/
Account for Device Type
Does your site get the majority of its traffic from mobile devices? You may want to account for device type in your analysis, to see whether there are significant differences between the desktop, tablet and mobile versions of the site.
Account for Specific Bots
You may also want to account for different bots, since some sites return content differently based on the bot crawling them. The 'Chrome' user agent most accurately represents a real browser and what is actually rendered.
https://www.screamingfrog.co.uk/seo-spider/user-guide/configuration/#user-agent-2
Account for Green Hosting (if being used!)
Enable this option if you know a website uses green energy hosting. You can check whether your site runs on green energy with the Green Web Foundation's Green Web Check.
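CO2.js also exposes the Green Web Foundation's green hosting lookup, so the same check can be scripted; a minimal sketch (the domain is just an example):

```javascript
// Sketch: checking a domain against the Green Web Foundation dataset
// via CO2.js. The domain is an example.
import { hosting } from '@tgwf/co2';

hosting.check('greenseo.org').then((isGreen) => {
  console.log(isGreen ? 'Green hosting verified' : 'No green hosting found');
});
```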
Consider Crawling URLs Blocked by robots.txt
Robots.txt controls which URLs bots may crawl. 'Ignore robots.txt' is the most accurate option here, because of what is actually rendered on the page: sites block resources for various reasons, but Chrome will still load and render them, as browsers ignore robots.txt. You can always exclude URLs!
https://www.screamingfrog.co.uk/seo-spider/user-guide/configuration/#ignore-robots-txt
Account for Cookies
We ignore GA tags by default, but these also have weight and therefore CO2 emissions. Enabling 'Cookies' accounts for these.
https://www.screamingfrog.co.uk/seo-spider/user-guide/configuration/#cookie-storage
Custom JavaScript Snippets – Page Scroll
Enabling the 'Page scroll' JavaScript snippet gives the most accurate results by emulating a user visiting the page. This ensures additional assets are loaded, such as lazy-loaded resources; another example is e-commerce stores that load additional products as the user scrolls.
https://www.screamingfrog.co.uk/seo-spider/tutorials/how-to-crawl-with-chatgpt/
Control the number of times a page is scrolled and the wait time between scrolls. See this in action by toggling the globe icon, entering a URL and clicking 'Test'.
https://www.screamingfrog.co.uk/seo-spider/tutorials/how-to-crawl-with-chatgpt/
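The slides don't reproduce the bundled snippet itself, but a scroll loop of roughly this shape is what is being emulated, with the scroll count and wait time as the configurable values mentioned above:

```javascript
// Rough sketch of a page-scroll snippet (not the exact snippet bundled
// with the SEO Spider): scroll N times, pausing between scrolls so
// lazy-loaded assets have time to request.
const SCROLL_COUNT = 10; // configurable: number of scrolls
const WAIT_MS = 1000;    // configurable: wait between scrolls

async function scrollPage() {
  for (let i = 0; i < SCROLL_COUNT; i++) {
    window.scrollTo(0, document.body.scrollHeight);
    await new Promise((resolve) => setTimeout(resolve, WAIT_MS));
  }
}

scrollPage();
```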
Incorporating API Data
Connecting the GA4 API
Sign in, then choose your account, property and data stream as required.
https://www.screamingfrog.co.uk/seo-spider/user-guide/configuration/#google-analytics-integration
GA4 Date Range
Consider what date range to use for your analysis. A full year can help account for seasonality in a first review. Also consider whether content updates or site changes were made within your date range.
GA4 Metrics
'Sessions' and 'views' are enabled by default; metrics such as 'views per session' are not. 'Views' will likely be the most accurate metric for comparisons. You can find out more about analysing the data in Ben Meyer's carbon audit article.
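If you ever need the same views-by-page data outside the Spider's built-in integration, a minimal sketch against the GA4 Data API might look like this; the property ID and date range are placeholders:

```javascript
// Sketch: views by page path from the GA4 Data API
// (npm: @google-analytics/data). Property ID and dates are placeholders.
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const client = new BetaAnalyticsDataClient();

async function viewsByPage() {
  const [response] = await client.runReport({
    property: 'properties/YOUR_GA4_PROPERTY_ID',
    dateRanges: [{ startDate: '365daysAgo', endDate: 'today' }],
    dimensions: [{ name: 'pagePath' }],
    metrics: [{ name: 'screenPageViews' }],
  });
  for (const row of response.rows ?? []) {
    console.log(row.dimensionValues[0].value, row.metricValues[0].value);
  }
}

viewsByPage();
```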
GA4 Filters
Consider the dimension type: 'session default channel group' reports each separate channel by which users arrive on the site. Finally, select the channel. Currently, you can only pull one channel per crawl; multiple crawls are required for additional data.
https://www.screamingfrog.co.uk/seo-spider/user-guide/configuration/#google-analytics-integration
Segmenting URLs Based on Data
Label URLs at scale for quick grouping and analysis.
View Segment Summary Data
Quickly filter URLs by segment, and get an overview of a segment's indexability, issues, warnings and opportunities.
https://www.screamingfrog.co.uk/seo-spider/user-guide/tabs/#segments-2
Segment Rules
Segment using almost any metric available in the SEO Spider, including:
- API data
- custom extraction
- custom JavaScript extraction data
https://www.screamingfrog.co.uk/seo-spider/user-guide/tabs/#segments-2
Segment by Carbon Rating
Set rules using practically any crawl metric in the SEO Spider. Use regex to streamline your rules: 'Matches Regex' with the OR operator (|) lets you prepare rules quickly.
https://www.screamingfrog.co.uk/seo-spider/user-guide/tabs/#segments-2
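For example, a single 'Matches Regex' rule can capture several URL groups at once; a hypothetical pattern (the paths are made up):

```javascript
// Hypothetical "Matches Regex" rule: one pattern catching both blog
// and news URLs for a single segment.
const segmentRule = /\/blog\/|\/news\//;

console.log(segmentRule.test('https://example.com/blog/green-hosting')); // true
console.log(segmentRule.test('https://example.com/products/widget'));    // false
```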
Segment by API Data
Combine segments with your API data to help prioritise which URLs to optimise.
Automating Crawls
Set a crawl to run automatically, whether daily, weekly, monthly or just once, at any time of day. You can add, duplicate, edit and delete scheduled crawls.
https://www.screamingfrog.co.uk/seo-spider/user-guide/general/#scheduling
The task name is the name of the individual crawl. Using the same project name groups multiple crawls under one folder; different project names create different folder groupings.
https://www.screamingfrog.co.uk/seo-spider/user-guide/general/#scheduling
Set your crawl options:
- Crawl Mode: Spider or List mode
- Crawl Seed: the starting URL
- Crawl Config: add custom crawl configuration settings
- Auth Config: add password configuration settings
https://www.screamingfrog.co.uk/seo-spider/user-guide/general/#scheduling
Enable any API you want:
- GA, GA4 and GSC can be configured directly in the tab.
- Other APIs need to be connected, and their data options selected, via a custom configuration file.
Automating Crawls – Google Sheets
Exporting specific data to Google Sheets allows greater flexibility. Basic export settings only require:
- a Google Drive account
- Format: Gsheet
- your specific tab, bulk export or report
Automating Crawls – Looker Studio
Export data in a format that can be used directly in Looker Studio, and configure the metrics you want to include. The 'High Carbon Rating' metric provides an overview of the number of pages classed as having a high carbon rating.
https://www.screamingfrog.co.uk/seo-spider/user-guide/general/#scheduling
Exporting for Looker Studio provides the data in a dated overview format, allowing you to connect a single Google Sheet that is updated over time.
How to conduct a website carbon audit
Ben Meyer, BM DIGITAL
LinkedIn | Ben Meyer
benmeyer.digital
Hi! I’m Ben
What we'll cover:
● Principles of a low carbon website
● The differences configuration can make to the output of a crawl
● Why these are important
● Accounting for ALL traffic, bots included
● Tailoring your crawl to your website
● Data & deliverables
● AI & the scale of the problem
What we won't cover today:
● The digital carbon emissions calculation model itself and any perceived shortcomings
● Scope 3 emissions
● The difference between CO2 and CO2e
● The complications of Content Delivery Networks (ask me at the end! Because I will be of ZERO help!)
Who here’s a techy?!
What makes a low carbon website?
Basic principles:
● What happens when you put a URL in your address bar or click a link?
Source: LinkedIn
What makes a low carbon website?
● Sustainable Web Design guidelines.
● Golden rule: aim for each page to load at or below 1 MB (see the console sketch after this list for a rough way to check).
● When we start crawling, we will get different results depending on a number of factors. This can seem overwhelming at first, but it becomes quite logical, allowing highly customised and more accurate website carbon crawls at scale.
● The full calculation model is available on the Sustainable Web Design website, along with the official guidelines from the World Wide Web Consortium's (W3C) sustainable web design community group.
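As a rough way to sanity-check a page against that 1 MB rule from the browser console, a sketch using the Resource Timing API; note transferSize reads 0 for cached resources and for cross-origin assets without Timing-Allow-Origin, so treat the result as a floor:

```javascript
// Rough page-weight check from the browser console. transferSize is 0
// for cached resources and for cross-origin assets that don't send
// Timing-Allow-Origin, so the total is a lower bound.
const nav = performance.getEntriesByType('navigation')[0];
const resources = performance.getEntriesByType('resource');
const totalBytes = resources.reduce(
  (sum, r) => sum + (r.transferSize || 0),
  nav ? nav.transferSize : 0
);
console.log(`~${(totalBytes / 1048576).toFixed(2)} MB transferred`);
```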
JS Rendering
Screaming Frog: “This means pages are fully rendered in a headless browser first, and the rendered HTML after JavaScript has been executed is crawled.”*
Important because it:
- renders more resources than plain text
- is essential for scrolling snippets & advanced crawling
*Source: https://www.screamingfrog.co.uk/seo-spider/tutorials/crawl-javascript-seo/
Custom JS Snippet: Scrolling the Page
THE MOST IMPORTANT PART
● The problem of the 1mm gap
● All website carbon measurement tools use some way of calculating page weight.
● Don't trust Google Lighthouse!
○ THE READINGS ARE WRONG!!!
Example: https://allrecipes.com
PROOF!!
Green Hosting Calculation
Super easy: if your host uses green energy, turn it ON; if it doesn't, DON'T! It is complicated slightly by CDNs. Check with your server providers or infrastructure teams, or use the Green Web Foundation.
Include or Exclude Cookies
Everybody HATES cookieeeess!! Compliance issues, errors, technical headaches, cost. I saw a client's website improve from a B to an A purely by removing 5 trackers. Use them wisely, avoid them if you can, and have a process or strategy in place for removing them once you're done using them.
Respect or Reject robots.txt
You might be tempted to crawl EVERYTHING, but stop! Ask yourself: what is there to gain by not respecting robots.txt? Check with the devs: usually, unless there is significant content behind it that is being actively used (e.g. a headless CMS), it is probably best to respect whatever is there. You also won't be able to correlate blocked URLs with GA views data very easily.
Device & Browser Emulation
Important because websites render differently on different devices. Screaming Frog can emulate a range of devices & browsers; this is useful because different hardware and software load pages differently, and you will see different results. You could run several crawls as different devices and browsers, depending on the makeup of your audience over a year.
Tailoring your crawl to your client
● Identify your client's digital audience:
○ How many hits?
○ Most common events
○ Is it transactional? Are there paths that users have to follow to use the platform, e.g. e-commerce, membership, a ticketing system?
○ Are there any protected areas to account for? Are these considered valid for the crawl?
○ Are they visiting on mobile, desktop or tablet? What are you trying to drive towards?
WHO · WHAT · WHY · HOW
● WHO is your client? Will they be interested?
● WHO will they be presenting this data back to? What will THEY be interested in?
● WHAT would be the real-world impact of any improvements?
● WHY has your client asked you to do this?
● HOW can this save them money or help them make more?
● HOW can this help them achieve their mission objective or their website objectives?
● HOW are you going to shape your findings into an action plan? (It will likely appear very technical.)
● WHAT resources do they have to take meaningful action?
Tailoring your crawl to your client
Interesting crawls to run might be:
- As close to a fully accurate crawl as you can manage, using all the data available, e.g. cookie opt-in rates, GA4 traffic makeup, respecting robots.txt, loading all videos etc.
- Purchase path analysis
- Cross-referencing server logs to account for bot traffic
The Screaming Frog SEO Spider is the only crawler (that I know of) able to crawl at such scale with such accuracy.
Disclaimer: make it clear that this will always be an estimate only.
Deliverables: A Starting Point
Let's go to Looker Studio!
My FREE Looker Studio template: blending crawl data with live GA4 data.
A quick note on the scale of the problem
Your website is not insignificant, but in the context of your entire organisation's carbon emissions, you might find it's not as significant as other factors. Don't feel bad if your website isn't great right now: realistically, it's only one part of your company's total carbon emissions, and it's important to identify the biggest contributors. Every online call you make, ChatGPT query you submit, training video you watch, company laptop you dispose of, train ticket you buy and Pret A Manger lunch you purchase has an emission. Put your effort where it stands to have the most impact.
Next, some more worrying stats…
Video Content Emissions
● TikTok produces 2.63g of CO2 per minute, per user. (Source)
● TikTok releases 40,151 tonnes of carbon dioxide per day across its user base. (Source)
● TikTok has 1bn monthly active users. Assuming a third of them are daily users, TikTok emits “more CO2e each year than it would take to fly the entire population of London to New York and back.”
Approx. 13.6m tonnes for the flights (8m population of London, 1.7 tonnes per flight); approx. 14.6m tonnes for TikTok.
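Both yearly figures can be re-derived from the slide's own per-day and per-flight numbers:

```javascript
// Re-running the slide's arithmetic (figures as quoted above).
const tiktokTonnesPerDay = 40151;
console.log((tiktokTonnesPerDay * 365) / 1e6); // ~14.7m tonnes/year for TikTok

const londonPopulation = 8e6;
const tonnesPerReturnFlight = 1.7;
console.log((londonPopulation * tonnesPerReturnFlight) / 1e6); // 13.6m tonnes for flights
```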
● In 2022, Netflix reported that fans clocked up more than 6bn hours watching the top 10 shows in the first 28 days after each show was released.
● The Carbon Trust says the European average for streaming is about 55-56g of CO2e per person, per hour.
● Based on this estimate, the CO2 emitted in 2022 from this activity “equates to about 1.13bn miles (1.8bn km) of travel in a car – the approximate equivalent of the current distance between Earth and Saturn.” - The Guardian, 2021
● In 2024, Netflix reported over 100 billion hours watched, meaning that, on roughly the same calculation, we could make the journey to Saturn… 22 times?!
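The "22 times" figure follows from scaling the 2022 estimate, assuming an average Earth-Saturn distance of roughly 0.85bn miles (an assumption; the actual distance varies considerably):

```javascript
// Scaling the slide's 2022 estimate to 2024 viewing hours.
const milesPer6bnHours = 1.13e9;   // The Guardian's car-mile equivalent for 2022
const hoursRatio = 100e9 / 6e9;    // 2024 hours vs. 2022 hours
const earthToSaturnMiles = 0.85e9; // assumed average distance
console.log((milesPer6bnHours * hoursRatio) / earthToSaturnMiles); // ~22 trips
```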
Artificial Intelligence Emissions
● A Google search emits 0.2g of CO2 (according to Google in 2009). Today, a ChatGPT query is estimated to emit 4.32g of CO2.
● That is 21.6 times the emissions, and just 16 queries would be the equivalent of boiling a kettle.
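Checking the comparison with the figures as quoted:

```javascript
// The slide's own numbers: ratio and kettle equivalence.
const googleSearchGrams = 0.2;  // per search (Google, 2009)
const chatgptQueryGrams = 4.32; // per query (estimate quoted above)
console.log(chatgptQueryGrams / googleSearchGrams); // 21.6x
console.log(16 * chatgptQueryGrams); // ~69 g: the kettle-boil equivalent cited
```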
● “If our scenario unfolds for a 10x increase in AI and data-center energy needs, the latter could consume about 17% of US electric power generation in 2030…” - Bloomberg Intelligence
● Microsoft just bought a nuclear power plant to power their AI data centers. “The company will purchase the plant's entire electric generating capacity over the next 20 years.” This is “enough to power 800,000 homes every year”. - MIT Technology Review & Bloomberg
Scale is the problem.
It’s also the solution.
You CAN make a difference
https://dodonut.com/
Recommended Experts
- Wholegrain Digital
- Liam McLaney - Branch Agency
- Josh Stopper - Beleaf - AUS based
- Nick Lewis - Lead Developer at Leap
- Leap.eco
- Jerome Toole
- Mightybytes - US based
- Andy Davies
- Scott Stoneham - Digital Carbon Online
Any Questions?
Sources for all the data are available on the next few slides.
Sources: Website Carbon Auditing
- How to conduct a website carbon audit using Screaming Frog SEO | BM DIGITAL
- The 1mm gap | Wholegrain Digital
- Green Web Foundation
- Website Carbon
- Looker Studio data explorer of Twickenham Choral - compares 11 crawls of the same site under different configs
- Vercel | How UX demands and DIY infrastructure headaches shaped the next generation of web application delivery
Sources: Video (data between 2021 and 2024)
- Social Media Emissions | Statista
- Social Media Carbon Emissions Research | Bankless Times
- Social Media Emissions | Greenspector, 2021
- TikTok emits 3 times more carbon than Facebook per minute of use | Bankless Times, 2023
Sources: Artificial Intelligence Emissions & Energy Usage
- Today Explained | AI's nuclear option - Spotify podcast
- Microsoft AI needs so much power it's tapping site of US nuclear meltdown - Bloomberg
- Data center emissions probably 662% higher than big tech claims. Can it keep up the ruse? - The Guardian, 2024
- ML CO2 Impact Calculator
- Using GPT-4 consumes up to 3 bottles of water to generate 100 words - Tom's Hardware