When it comes to Search Engine Optimisation (SEO), website architecture is extremely important. Most users will leave your website if it takes them more than 3-4 clicks to reach the information they need.
Think of it this way: if you visited a supermarket to buy something but there were no labels, no signs, and the products were not organised into categories, you would probably give up and go to another store.
The same applies to websites. That's why it's extremely important to structure your site in an easy-to-navigate manner. A sound website architecture results in a good user experience, and with the latest page experience update, Google has started giving user experience a lot more weight.
As your site grows larger, it becomes more difficult to maintain a sound site architecture and identify indexability issues. Let’s first understand what site architecture means and why it’s important for SEO.
Website Architecture refers to the hierarchy in which your website pages are arranged. Your website’s structure should help users and search engine crawlers navigate your site easily.
You can expect high retention and conversion rates if your site enables users to reach the content that they want to see easily.
Usually, a website has one of two structures: flat or deep.
Flat Site Architecture:
A flat website architecture means that the user can access any page on the website in 4 clicks or fewer.
This is what a flat site architecture looks like:
Deep Site Architecture:
A deep website architecture refers to a structure where it takes the user more clicks (usually > 4) to reach a specific page on the site.
This is what a deep site architecture looks like:
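To make the flat-vs-deep distinction concrete, click depth is simply the shortest path from the homepage through the internal link graph, which you can compute with a breadth-first search. Here's a minimal sketch; the link graph below is a made-up example, not a real site:

```python
from collections import deque

def click_depths(links, start):
    """Compute the minimum number of clicks from `start` to every
    reachable page, using breadth-first search over the link graph."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path in BFS
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical internal link graph: page -> pages it links to
links = {
    "/": ["/blog", "/about"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/blog/post-1/comments"],
}

depths = click_depths(links, "/")
# A "flat" architecture keeps every page within 4 clicks of the homepage
is_flat = all(d <= 4 for d in depths.values())
```

Any page whose depth comes out above 4 is a candidate for better internal linking.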
Is Website Architecture important for SEO?
Site Architecture is very important for SEO for various reasons:
1. Link Equity (a.k.a “Link Juice”) is passed from one page to another:
Proper internal linking to pages passes link authority to those pages. If your critical pages are too deep in the website, then less link authority will be passed on to them.
Ensure that the user can visit your critical pages with 4 or fewer clicks.
2. Search Engine Crawlers have a fixed crawl budget:
Search engine crawlers, such as Googlebot, want to crawl your site without overwhelming your servers, so each crawler allocates your site a fixed crawl budget or limit. If you have a large site and have buried your critical pages deep within it, those pages will be crawled less frequently, hurting your site's SEO.
3. Site Architecture affects the User Experience:
This indirectly affects your SEO. If your website architecture is deep and it takes users too long to reach the page they want to see, they'll probably leave your site before even reaching that page, and they'll never convert.
Thus, it’s recommended to have a “flat” website architecture.
Now that we understand what website architecture means and why it’s important for SEO, let’s see how you can analyse your site architecture and fix any issues that may arise.
How to use Screaming Frog to analyse your Website Architecture?
In the previous post, “How to use Screaming Frog to Find and Fix Broken Links (404) for Free?”, I talked a little about Screaming Frog SEO Spider. It's a great tool that lets you handle a bunch of SEO tasks, such as finding broken links, generating sitemaps, and analysing site structure.
You can download the free version from here, which allows you to analyse up to 500 URLs. That's plenty for small websites; if your site has more than 500 URLs, you can analyse it in smaller batches (with the help of the exclude feature) or purchase the paid version.
Let’s see how you can analyse your website architecture and identify any potential issues. For this, I’m using Nomadic Matt (https://www.nomadicmatt.com/) as an example.
Step 1: Start the Website Crawl
Enter your website URL in the address bar and click on start to initiate the crawl.
Step 2: Look at Visualisations to analyse Site Architecture
Visualisations help you understand the site architecture more easily than spreadsheets do. Screaming Frog has two different types of visualisations: crawl visualisations and directory tree visualisations.
In this tutorial, we’ll just be looking at crawl visualisations.
What are crawl visualisations?
Screaming Frog SEO Spider's crawl visualisations show how the SEO Spider crawled the site: each page is placed along the shortest path from the start URL, which in most cases is the site's homepage.
Note: If one page has multiple shortest paths, the visualisation will show the path that was crawled first.
Screaming Frog's SEO Spider has two different types of crawl visualisations: the Crawl Tree Graph and the Force-Directed Crawl Diagram.
Let’s look at both of them.
Step 2.1: Analyse the Crawl Tree Graph
Just select the Crawl Tree Graph option from the visualisations menu.
You’ll see a similar visualisation. The circles represent the pages and the lines connecting these circles represent the links between these pages.
The tree diagram by default is arranged in a hierarchy (based on crawl depth) from left to right. However, you can change the orientation to top-to-bottom easily.
You’ll notice that if I remove the labels and just get a bird’s eye view of the website architecture, it looks like a flat website architecture as mentioned above – with every page accessible in 4 clicks or less.
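If you prefer working outside the visualisations, Screaming Frog can also export the crawl to CSV, and the export includes a crawl depth column. The sketch below flags pages deeper than 4 clicks; note that the "Address" and "Crawl Depth" column names are assumptions based on a typical export, so check the headers in your own file:

```python
import csv
import io

def deep_pages(csv_text, max_depth=4, depth_col="Crawl Depth"):
    """Return the URLs whose crawl depth exceeds `max_depth`.
    Column names are assumed; adjust to match your actual export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["Address"] for row in reader
            if int(row[depth_col]) > max_depth]

# Tiny stand-in for a real crawl export
sample = """Address,Crawl Depth
https://example.com/,0
https://example.com/blog,1
https://example.com/blog/archive/2015/06/old-post,5
"""

deep = deep_pages(sample)
```

Here only the archive URL at depth 5 would be flagged, since everything else sits within 4 clicks of the homepage.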
You'll also notice that the circles have different colors: non-indexable URLs are shown in red, indexable URLs in green. You can also click on the information icon to get more detail about the colors of the nodes/circles.
You can hover over the circle to find out the reason for non-indexability of the URL.
This color-coding system makes it extremely easy to identify problem areas. For instance, if we look at the screen recording above, we'll notice that a lot of the URLs linked to best-travel-credit-cards (https://www.nomadicmatt.com/travel-blogs/best-travel-credit-cards/) are non-indexable.
Hovering over the circles shows that these URLs are actually blog category URLs and carry the noindex tag.
It's common practice to noindex category pages, so this is a valid reason for the non-indexable status.
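If you want to confirm a noindex tag yourself, you can check a page's HTML programmatically. A small sketch using only Python's standard library; the HTML snippet at the bottom is a made-up example:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Looks for a <meta name="robots"> tag whose content
    includes the "noindex" directive."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots" and \
               "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html):
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

# Hypothetical category page marked noindex, follow
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
```

In practice you'd fetch the page HTML first (and also check the X-Robots-Tag HTTP header, which can carry the same directive), then pass it to `has_noindex`.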
Step 2.2: Analyse the Force-Directed Crawl Diagram
Just select the Force-Directed Crawl Diagram option from the visualisations menu.
The Force-Directed Crawl Diagram presents similar information to the Crawl Tree Graph, just in a slightly different manner. The nodes represent the URLs and the lines/edges represent the links between two URLs.
The nodes become lighter and smaller as the crawl depth level increases. So the homepage (where the crawl usually starts) will be the biggest and the darkest node. The color represents indexability of the page (same as in the Crawl Tree Graph).
If you wish to focus on a specific area in your site, you can easily do that by right-clicking on a node and pressing focus here. This will allow you to narrow down on one area and identify problem URLs in that section without getting overwhelmed by the number of nodes in the visualisation.
Screaming Frog's SEO Spider is a powerful tool that you can use to analyse your website structure, identify problem areas and improve your site's overall SEO.
Make sure to go through the other post in the SEO Spider series to understand how you can find and fix broken links on your site using 301 redirects.