Making the Web Greener - Starting with SEO

Jul 9, 2024

8 min read

Did you know that the internet contributes to nearly 4% of global carbon emissions? That’s even more than the aviation industry, which stands at 2.5%. 

Every time a website gets visited, carbon is released due to the resources required to load the various elements that make up a webpage.

One thing I've been looking at recently is how we can encourage meaningful change in the SEO industry and how we can be a part of making the web greener.

I believe that there are important ways in which the industry can adapt to become more sustainable and greener—you can read more about them below.

Encouraging clients to use a green web host

Many web hosting providers still run on what is known as dirty energy, though thankfully a growing number are now powered by renewable sources.

There's a great tool by the Green Web Foundation that lets you check whether your site is on a green host - the Green Web Check.

Whilst this tool isn't 100% accurate (it reports my own website as being on a green host, but only because Cloudflare powers the DNS, which produces a false positive), it is a great starting point for diving deeper into this topic.
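If you want to run the same check across a list of client domains, it can also be done programmatically. The endpoint and the "green" response field below are based on the Green Web Foundation's public greencheck API - this is a minimal sketch, so verify the details against their own documentation before relying on it:

```python
import json
import urllib.request

# Assumed endpoint for the Green Web Foundation's greencheck API.
API_BASE = "https://api.thegreenwebfoundation.org/greencheck/"


def parse_greencheck(payload):
    """Interpret an API response; a boolean "green" field is assumed."""
    return bool(payload.get("green", False))


def is_green_host(domain):
    """Ask the Green Web Foundation whether a domain is on a green host."""
    with urllib.request.urlopen(API_BASE + domain, timeout=10) as resp:
        return parse_greencheck(json.load(resp))


if __name__ == "__main__":
    # Network call - requires internet access.
    print(is_green_host("example.com"))
```

Looping this over a spreadsheet of client sites gives you a quick starting point for the green-hosting conversation.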


The Green Web Foundation also has a directory of green web hosts, should you be in the market for a new one. The list is sorted by country, so you can easily find a host located where your website gets most of its traffic - an important factor to consider.

Reducing the amount of unwanted bot traffic on your host

Anyone who's spent time analysing server log files will have noticed just how much traffic on the web is, well, kind of pointless.

Estimates from Cloudflare show that up to 30% of all web traffic is bot-related, as you can see below, and it has been as high as 40% in the past.

[Chart: Cloudflare data showing up to 30% of web traffic is bot-related]

If we think about the resources used up every time these bots hit a website on a server, we soon start to realise that a huge amount of energy and water is being wasted, not to mention the carbon emitted, simply from serving these bots.
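To get a feel for the scale on your own site, you can tally bot hits straight from an access log. A minimal Python sketch, assuming the common combined log format (where the user agent is the final quoted field); the bot signatures are illustrative, so build your own list from real log data:

```python
import re
from collections import Counter

# Illustrative bot signatures - extend this from your own log data.
BOT_SIGNATURES = ["bot", "crawler", "spider"]

# In the combined log format, the user agent is the final quoted field.
UA_PATTERN = re.compile(r'"([^"]*)"$')


def count_bot_hits(log_lines):
    """Tally requests per user agent for lines that look like bot traffic."""
    hits = Counter()
    for line in log_lines:
        match = UA_PATTERN.search(line.strip())
        if not match:
            continue
        user_agent = match.group(1)
        if any(sig in user_agent.lower() for sig in BOT_SIGNATURES):
            hits[user_agent] += 1
    return hits
```

Sorting the resulting counter by value quickly surfaces which crawlers are hammering the server hardest.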

Whilst many SEOs will be reluctant to block important SEO software crawlers (such as Ahrefs), it's important to realise that these incur a significant cost in terms of resource usage. If we want to retain the use of these tools, what about finding out who the other bots belong to - and blocking them if we don't need them?


A little while ago, it was suggested on Twitter that WordPress (which powers around 40% of the web) should block all known bot crawlers by default. But this would be a very drastic (and highly unpopular!) action to take.

[Embedded tweet by Joost de Valk]

Instead, we could figure out which bots are hitting our web hosts regularly, determine whether they are needed, and then block them at the robots.txt level or through a security tool like Cloudflare.
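As a hypothetical robots.txt sketch (the bot names here are invented for illustration - and remember that only well-behaved crawlers respect robots.txt; persistent bad bots need blocking at the server or CDN level instead):

```
# Block two crawlers we've decided we don't need (names are illustrative)
User-agent: ExampleScraperBot
Disallow: /

User-agent: AnotherUnwantedBot
Disallow: /

# Everyone else, including search engines, is still welcome
User-agent: *
Allow: /
```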

Be careful, though - you could end up blocking all bots, including search engines like Google, which would not be ideal! This recently happened to the website of former UK Prime Minister Liz Truss; for several weeks, it wasn't included in Google's search results.

[Embedded tweet by Matt Tutt]

Check a company's environmental credentials before you work with them

I've seen many case studies published by SEO agencies that have worked with clients in particular industries, and one I see popping up fairly regularly is that of the fast fashion industry. 

If the environment is important to you on a personal level, it's worth stopping to think before agreeing to work with clients in industries you know are problematic.

So before you sign that contract or take on that client within an agency, ask yourself if you're comfortable with how their business operates. Do they publish sustainability records on their website? Do they have a history of greenwashing? Thankfully, this topic is being discussed more and more openly, so it shouldn’t be hard to find out if a company does care about environmental issues.

Obviously, as SEOs, our job is to help businesses grow online, usually by generating more sales and revenue from organic traffic. Do we want to be associated with helping a business that is seen to deplete environmental resources, or one that takes advantage of its workforce, largely for the benefit of its shareholders?

For those working within an agency, hopefully you'll find you have a strong and mature manager who will allow you to opt out of working with clients from particular industries you would rather avoid - for whatever reason.

Adopting the use of the IndexNow scheme

The IndexNow project has been around for a few years now and aims to make it easier for webmasters and site owners to tell search engines that the content on their website has been updated.

[Screenshot from IndexNow.org]

As you will note above from their website, IndexNow works in a similar fashion to an XML sitemap: it pings search engines whenever new content is published or updated on a particular website.

This ping allows a search engine to come and update its search results without the webmaster having to wait for the search engine to rediscover the page on its own.
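The protocol itself is simple: you host a key file on your domain, then POST the changed URLs to a participating endpoint. A minimal Python sketch based on the protocol described at indexnow.org - the host, key, and key-file location below are placeholders, so check the official documentation for the exact requirements:

```python
import json
import urllib.request

# Assumed shared endpoint; participating search engines forward submissions.
INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"


def build_indexnow_payload(host, key, urls):
    """Build the JSON body for an IndexNow submission.

    The key file is assumed to live at https://<host>/<key>.txt.
    """
    return {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": list(urls),
    }


def submit(host, key, urls):
    """POST changed URLs to the IndexNow endpoint; returns the HTTP status."""
    body = json.dumps(build_indexnow_payload(host, key, urls)).encode("utf-8")
    req = urllib.request.Request(
        INDEXNOW_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status
```

A CMS could call something like `submit()` from its publish hook, so every content change is announced the moment it goes live.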

Ahrefs was one of the first SEO platforms to strike up a relationship with IndexNow, announcing a partnership back in March that allows customers to be notified when there are issues with their site, as well as to get content indexed more quickly.

[Screenshot - image source: Ahrefs]

IndexNow appears to be a good energy- and carbon-saving option because it makes an SEO's work more efficient - search engines no longer need to repeatedly re-crawl pages or sites checking for updates, as they can simply wait for the ping from IndexNow. But this only works if search engines like Google adopt the scheme, which they have not yet done.

This is similar to Cloudflare’s own Crawler Hints program, which pings search engines when content changes on a Cloudflare-powered website. 

When explaining the environmental benefits of the Crawler Hints program, Cloudflare state:

If 5% of all Internet traffic is good bots, and 53% of that traffic is wasted by excessive crawl, then finding a solution to reduce excessive crawl could help save as much as 26 million tonnes of carbon cost per year. 

According to the U.S. Environmental Protection Agency, that's the equivalent of planting 31 million acres of forest, shutting down 6 coal-fired power plants forever, or taking 5.5 million passenger vehicles off the road. - Cloudflare


Paying attention to technical SEO and practising good UX/accessibility

There are lots of areas within SEO, particularly technical SEO, where your day-to-day work correlates quite directly with making the web greener.

For a start, when working on a large enterprise or ecommerce site, one of your biggest tasks will likely revolve around making crawling the site more efficient.

There might be spider traps to fix, architecture issues, redirect chains, site speed issues, and all kinds of technical issues, which, if left untouched, result in more use of server resources and probably a frustrating site for the user to navigate.

By addressing these technical SEO issues while also caring about site usability and accessibility, you can end up not only improving a website for the end-user (and Google) but actively reducing its environmental impact at the same time.
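Redirect chains are a good concrete example: every extra hop is a wasted round trip for each visitor and crawler. A minimal sketch for auditing them, using Python's standard library only - the hop logic is pulled into a small pure helper so it's easy to verify, and any URLs you pass in are your own:

```python
import urllib.error
import urllib.parse
import urllib.request

REDIRECT_CODES = {301, 302, 303, 307, 308}


def next_hop(status, location, current_url):
    """Return the next URL in a redirect chain, or None if this is the end."""
    if status in REDIRECT_CODES and location:
        # Location headers may be relative, so resolve against the current URL.
        return urllib.parse.urljoin(current_url, location)
    return None


class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib following redirects automatically, so each hop is visible."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def redirect_chain(url, max_hops=10):
    """Follow a URL hop by hop; returns a list of (status, url) pairs."""
    opener = urllib.request.build_opener(_NoRedirect)
    chain = []
    for _ in range(max_hops):
        try:
            resp = opener.open(url, timeout=10)
            chain.append((resp.status, url))
            break  # final destination reached
        except urllib.error.HTTPError as err:
            chain.append((err.code, url))
            url = next_hop(err.code, err.headers.get("Location"), url)
            if url is None:
                break
    return chain
```

Running this across a crawl export makes it easy to spot chains of three or more hops that should be collapsed into a single redirect.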

In Summary - Can SEOs make the web greener?

Hopefully, after reading the above, you will realise that there are many ways in which SEOs can choose to make the web greener, many of which result from simply doing our jobs properly.

From implementing technical SEO changes to making site navigation clearer and easier to use, there are a near-endless number of ways in which you can reduce your website’s carbon emissions. 

If you want to read even more about sustainable SEO, I recommend the MightyBytes website. They cover the topic in great detail and share other ways SEO can be tied to making the web greener.

Aside from all of the above, one of the best things you can do is to start talking about this topic more often. Find opportunities to write about it - like I’ve done here with Advanced Web Ranking, talk about the topic on social media, or speak with your friends and colleagues about it.

Article by Matt Tutt

Matt Tutt is a freelance SEO consultant who specialises in helping charities and other purpose-driven brands get found online.

With over 15 years' experience as an all-round SEO specialist, Matt has helped countless businesses grow their online traffic and find new customers from Google.

Matt loves talking about sustainability and how it can relate to SEO, and when he's not growing the organic visibility of a client he can be found out in his garden, where he's trying to grow as much food as possible (organically, of course!)
