What is an SEO Audit and How to Carry Out an Analysis
Few people know the true importance of doing an SEO audit. Many clients still do not understand that you should not start working on the positioning strategy of a new web project without first carrying out this complete analysis.
Why? Simple: this type of web audit helps us both to identify the strengths of the project and the SEO errors that need correcting, in order to improve its visibility on the Internet.
What is an SEO Audit?
An SEO audit is a complete review and study of a website, in which its general state is analyzed in relation to its ability to rank in search engines, looking for errors that may harm its visibility and for possible points of improvement in terms of SEO, traffic and conversions.
When should an SEO audit be performed?
In the services that I offer to my clients, this is a key step before starting to work on any type of SEO strategy. The reason is clear: if you do not know in depth the current situation of the website, you cannot know what actions are needed to improve its positioning.
However, there are many other situations and moments when it is appropriate to carry out an audit of this type; we will talk about this at the end of this guide.
Do you want to learn how to do an SEO audit? Then read on and discover how to carry out this analysis, so important for a Digital Marketing Plan or for any business strategy that aims to be more competitive.
Before starting, I would like to clarify that this is not about focusing on one or two aspects, but about analyzing every factor that intervenes in organic positioning in a comprehensive way, so you do not have to follow a strict order.
That said, let's look at the 7 fundamental points you need to study when performing an SEO audit:
1. Web Crawling and Indexing
Without crawling there is no indexing, and without indexing there is no ranking. Our first step is to analyze the indexing and crawling of our site. This section is the most important, since here we check whether our content is correctly indexed and whether there are errors at the page-crawling level.
1.1 – Indexing
The first thing to consider is that we may not want to index all the content of a website, but only the content we want to rank in Google.
Generally, this relevant content is the content that adds value to users performing a search on Google. Any content that does not respond to a user's search intention can be considered "not relevant", since Google is not interested in having filler content among its results.
1.2 – Crawl Frequency
At this point, we want to verify that no problem or block is preventing Googlebot from reaching the different URLs and sections of our website. And what about crawl frequency? Basically, Google is said to assign each website a "Crawl Budget", a kind of crawling allowance that determines the maximum time its robot will spend crawling the site.
The important thing here is to make the most of that Crawl Budget, letting Google crawl only the pages we need to rank and preventing it from wasting time on irrelevant content. To analyze both indexing and crawling, you can use tools like Search Console and Screaming Frog.
1.3 – Robots.txt file
You should examine the robots.txt file to verify that no important URLs or directories on the website are blocked. If we want Google to index our URLs, this file must be well configured; otherwise the positioning of the project may be seriously affected.
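As a quick sanity check, robots.txt rules can be tested programmatically before deploying them. Below is a minimal sketch using Python's standard library; the rules and URLs are hypothetical examples, not taken from any real site.

```python
# A minimal sketch: test robots.txt rules with the standard library.
# The rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /search/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# URLs we want Google to be able to crawl and index
important_urls = ["/", "/blog/seo-audit-guide/", "/services/"]
for url in important_urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'BLOCKED'}")
```

A check like this in a deployment pipeline catches the classic mistake of accidentally disallowing the whole site.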
1.4 – Check the sitemaps.xml that are sent to Google
Another aspect worth checking is that you have submitted your sitemap.xml files correctly. Keep in mind that these files help Google understand the internal structure of your website, which makes it easier to crawl and consequently speeds up the process.
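As an illustration, this is a minimal sketch of building a sitemap.xml with Python's standard library; the URLs and dates are hypothetical placeholders.

```python
# A minimal sketch that builds a sitemap.xml with the standard library.
# The URLs and lastmod dates are hypothetical placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # serialize without a namespace prefix

urlset = ET.Element(f"{{{NS}}}urlset")
pages = [
    ("https://example.com/", "2024-01-15"),
    ("https://example.com/blog/seo-audit-guide/", "2024-01-10"),
]
for loc, lastmod in pages:
    url = ET.SubElement(urlset, f"{{{NS}}}url")
    ET.SubElement(url, f"{{{NS}}}loc").text = loc
    ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

Once generated, the file is typically submitted through Search Console or referenced from robots.txt with a `Sitemap:` line.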
1.5 – Error detection (Status Code)
This section seeks to detect possible HTTP status code errors on our website, namely 3xx, 4xx and 5xx responses. Errors of this type can hurt the user experience and cause the Google crawler to waste time and skip relevant URLs.
There is a simple way to detect these errors: tools. Search Console can help, but if you can pay for a premium tool, I personally recommend Screaming Frog or SEMrush's Site Audit, since they are quite fast and give you a lot of useful information.
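If you prefer to work from a raw crawl export, grouping URLs by status class is straightforward. Here is a minimal sketch; the URL/status pairs are hypothetical examples.

```python
# A minimal sketch that groups crawl results by HTTP status class,
# the way a crawler report does. The URL/status pairs are hypothetical.
def status_class(code: int) -> str:
    """Map an HTTP status code to the class an SEO report cares about."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect (3xx)"
    if 400 <= code < 500:
        return "client error (4xx)"
    if 500 <= code < 600:
        return "server error (5xx)"
    return "other"

crawl_results = {
    "/": 200,
    "/old-post/": 301,
    "/deleted-page/": 404,
    "/api/report/": 500,
}

for url, code in crawl_results.items():
    print(f"{code}  {status_class(code)}  {url}")
```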
2. Web Architecture
A bad information architecture can negatively influence many factors. Therefore, pay special attention to this step and review the structure, clusters and depth of the site, as well as its internal linking (all related to usability, crawling and indexing).
2.1 – Structure
One of the essential parts of any SEO audit is verifying the internal structure of the website. A good architecture begins with the creation of clusters (groupings of thematic information) that improve the usability and crawlability of the site.
2.2 – Depth or Click Levels (Crawl Depth)
Depth, also called click level, is the number of clicks needed to reach a specific piece of content starting from the most authoritative URL of the site. Taking the home page as level 0, we count depth levels according to the number of clicks needed to reach a page by navigating through the structure of the website.
Why is it important to know these depth levels? The lower the depth level, the easier it is for the user to reach that URL and the more relevance Google assigns to that page.
Here we should look for improvements that reduce the number of levels within the architecture of our website. This data can be obtained with tools such as Screaming Frog, Sitebulb, OnCrawl, DinoRank, SEMrush, Ahrefs and many more.
Crawl Depth: ideally, the important URLs of the site should be no more than 3 levels deep from the home page.
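The calculation behind this metric is a simple breadth-first search over the internal link graph. Here is a minimal sketch with a hypothetical site structure.

```python
# A minimal sketch computing click depth (crawl depth) from the home
# page with a breadth-first search over the internal link graph.
# The site structure below is a hypothetical example.
from collections import deque

# internal links: page -> pages it links to
links = {
    "/": ["/blog/", "/services/"],
    "/blog/": ["/blog/seo-audit/", "/blog/keywords/"],
    "/services/": ["/services/audit/"],
    "/blog/seo-audit/": [],
    "/blog/keywords/": ["/blog/old-post/"],
    "/services/audit/": [],
    "/blog/old-post/": [],
}

def click_depth(graph, home="/"):
    """Return the click depth of every reachable URL (home = level 0)."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depth:  # first time we reach this URL
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = click_depth(links)
for url, level in sorted(depths.items(), key=lambda kv: kv[1]):
    print(level, url)
```

In this toy graph, `/blog/old-post/` sits at level 3, exactly the kind of URL you would consider linking higher up the structure.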
2.3 – Internal Linking
Another important factor to analyze is internal linking, because it affects click levels, the information hierarchy, the architecture and the crawling of the site. Basically, you have to make sure that all relevant content on your site is strategically interlinked, distributing authority and reinforcing the information hierarchy. This is another way to guide the Google crawler through the structure of the website and to optimize the internal PageRank (PR).
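To make the idea of distributing authority concrete, here is a minimal sketch of a simplified internal PageRank computed by power iteration over a hypothetical three-page link graph. Real tools use far more elaborate models; this only illustrates how link placement shifts authority between pages.

```python
# A minimal sketch of how internal linking distributes authority,
# using a simplified PageRank power iteration over a hypothetical
# internal link graph (no external links, no dangling pages).
links = {
    "/": ["/blog/", "/services/"],
    "/blog/": ["/", "/services/"],
    "/services/": ["/"],
}

def internal_pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            share = rank[page] / len(outlinks)  # authority passed per link
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

for page, pr in sorted(internal_pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {pr:.3f}")
```

Here the home page ends up with the highest score because every other page links to it, which is the behavior internal linking strategies exploit.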
3. Usability and site performance
Another technical aspect to consider is the speed and interactivity of the website: users must feel that finding what they are looking for is easy. Improving fluidity when browsing also optimizes the user experience, something Google values highly today.
3.1 – WPO and site performance
An increasingly important technical factor is WPO (Web Performance Optimization), that is, making sure your website loads fast: intuitive navigation is not enough if browsing is not also quick and smooth.
Page loading speed matters both for the overall user experience and for Google's Crawl Budget. Many factors influence loading speed, from the hosting to poorly optimized images, so there are various technical aspects you should review.
3.2 – Navigability
In general, if you have a clear and easy-to-understand web structure, with correct internal linking and good navigability, both your users and the Google crawler will find it easy to move around your site.
Therefore, check that the structure is as optimal as possible, with correct linking, well-built menus, breadcrumbs in place, and so on.
3.3 – Mobile adaptability
Mobile First! Remember that for Google, mobile comes first. Every day more users visit you from mobile devices, so check that everything works as well on a phone as on a PC. If it does not, fix it quickly, because this seriously affects your positioning. There are tests, such as Google's Mobile-Friendly Test, to check whether your website is mobile friendly.
3.4 – Multi Platform
Remember that a page can be viewed from different operating systems, both mobile and desktop, and from multiple browsers. Therefore, test how it looks at least on an Android or iPhone device and in Chrome and Firefox. There are also online tests for checking multi-device compatibility.
3.5 – Web usability
We must take into account the usability of the website: it should be easy to understand, visitors should find what they are looking for quickly, without complex procedures or detours, and everything should work as expected.
Check that using the page is not overly complex, or at least that it is intuitive enough that users are not confused. I recommend putting yourself in the user's shoes and browsing the most relevant sections of the menu.
To analyze usability we can use Yandex Metrica, specifically its free Session Replay tool.
4. Quality of Content
Content is one of the most extensive and complex aspects to work on in an SEO audit. So let's look, point by point, at the different factors you must analyze, to make this study easier to carry out.
4.1 – Duplicate content
In this section we check whether there is duplicate content within the website or eCommerce (we should also check that no Title, H1 or Meta Description is repeated across different URLs). By duplication we mean content that is a copy of, or very similar in certain respects to, other content.
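Detecting duplicated Titles in a crawler export boils down to grouping URLs by their normalized title. A minimal sketch with hypothetical data:

```python
# A minimal sketch that flags duplicate <title> values across URLs,
# as listed in a crawler export. The data below is hypothetical.
from collections import defaultdict

page_titles = {
    "/blog/seo-audit/": "SEO Audit Guide",
    "/blog/seo-audit-2/": "SEO Audit Guide",
    "/services/": "Our SEO Services",
}

by_title = defaultdict(list)
for url, title in page_titles.items():
    by_title[title.strip().lower()].append(url)  # normalize before grouping

duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
for title, urls in duplicates.items():
    print(f"Duplicate title {title!r} on: {', '.join(urls)}")
```

The same grouping works for H1s and meta descriptions; you would just feed it a different column of the export.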
4.2 – Cannibalization
Content cannibalization, like duplication, is a problem we often fail to consider but can suffer without realizing it. Cannibalization occurs within the same site when two or more URLs target the same search intention, or when newly published content answers a query that existing content on the site already covered.
4.3 – Thin Content
Thin content is scarce content: content of very little value to the user or with no visits. The most effective way to identify it is to ask yourself: "Does this content respond to the user's search intention?" If the answer is no, you have low-quality content. This is generally the content we want to keep from being crawled and indexed, since it contributes nothing to the user, lowers the quality of our site in Google's eyes and wastes Crawl Budget.
Note that although "thin content" suggests content that is short, it does not refer exclusively to URLs with little text: a 250-300 word piece can still have a lot of value for the user.
To give you an idea, thin content also includes posts poorly translated from other languages and spun or plagiarized articles. In addition, this type of content can lead to a penalty from Google Panda.
4.4 – Analyze the titles and meta descriptions of your content
In addition to the content itself, it is also important to optimize the meta attributes for SEO, that is, what we know as the "meta title", the "meta description" and the URL of your article, as well as rich snippets and similar features.
We often do not pay due attention to these attributes, and this can take its toll once pages are already ranking, since they are our main means of capturing users' attention in the SERPs.
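A quick length check can catch titles and descriptions that will be truncated in the SERPs. The limits below are commonly cited guidelines (roughly 60 characters for titles and 155 for descriptions), not official Google thresholds, and the sample data is hypothetical.

```python
# A minimal sketch checking title and meta description lengths.
# The limits are commonly cited guidelines, not official thresholds.
TITLE_MAX = 60
DESCRIPTION_MAX = 155

def check_snippet(title: str, description: str) -> list[str]:
    """Return a list of snippet problems for one URL."""
    issues = []
    if len(title) > TITLE_MAX:
        issues.append(f"title too long ({len(title)} > {TITLE_MAX})")
    if len(description) > DESCRIPTION_MAX:
        issues.append(f"description too long ({len(description)} > {DESCRIPTION_MAX})")
    if not description:
        issues.append("missing meta description")
    return issues

print(check_snippet("SEO Audit Guide", ""))
```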
4.5 – Search intention and optimization of Keywords
When analyzing the use of keywords in the text, pay attention to the following points:
- Analyze the user's search intention for that term.
- Consider whether the content adequately responds to each search intention.
- Check keyword density (optimally between 0.5% and 1.5%, as appropriate).
- Study the good use of synonyms, long tails and semantic keywords.
- Evaluate the use of keywords in the anchor text of links.
- Check that keywords are used naturally and in context in the content.
- Examine the use of related keywords or synonyms in H2, H3 and similar tags.
With the help of this checklist you can do a quick review of the content and make sure it is not over-optimized.
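Keyword density, for instance, is easy to measure yourself. Here is a minimal sketch with deliberately naive tokenization and a hypothetical text; note that on a tiny sample like this the percentage is inflated and meaningless, the 0.5-1.5% guideline applies to full-length content.

```python
# A minimal sketch computing keyword density. Tokenization is
# deliberately naive (lowercase word runs only); the text and
# keyword are hypothetical examples.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in the text occupied by the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    if not words or not kw_words:
        return 0.0
    # count occurrences of the keyword phrase as a word sequence
    hits = sum(
        words[i:i + len(kw_words)] == kw_words
        for i in range(len(words) - len(kw_words) + 1)
    )
    return 100 * hits * len(kw_words) / len(words)

text = "An SEO audit reviews a site. A good SEO audit finds errors."
density = keyword_density(text, "SEO audit")
print(f"{density:.1f}%")
```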
4.6 – Analysis of the images
Images are an important part of the content: they help guide users and provide visual and aesthetic support. But it is not just a matter of inserting them and moving on; they must also be optimized.
When you perform an SEO audit, you also need to check that the images used on your website are correctly uploaded and use the "alt" and "title" attributes.
Also, to improve your WPO, make sure you reduce the file size of these images, so your URLs stay light and load as quickly as possible.
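Missing "alt" attributes can be flagged automatically. A minimal sketch using Python's standard-library HTML parser on a hypothetical snippet:

```python
# A minimal sketch using the standard-library HTML parser to flag
# <img> tags missing an "alt" attribute. The HTML is a hypothetical
# example.
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # absent or empty alt text
                self.missing_alt.append(attrs.get("src", "(no src)"))

html = """
<img src="/img/team.jpg" alt="Our SEO team at work">
<img src="/img/banner.png">
"""

checker = ImgAltChecker()
checker.feed(html)
print("Images missing alt text:", checker.missing_alt)
```

In practice you would feed it the fetched HTML of each URL in your crawl list.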
5. Organic Visibility in Search Engines
Now, in this section, we study the organic visibility we already have, in order to discover possible points of improvement and new keyword opportunities to rank for in Google.
5.1 – SEO penalties
Look for manual-action notices in Search Console, or for symptoms such as sudden traffic changes that suggest a penalty from Google. If there is one, study which algorithm caused it and the best measures to recover from the penalty.
5.2 – Keywords positioned in Google
Every content strategy begins with a choice of keywords. Therefore, you need to define both your main and secondary keywords when creating a new URL.
So, when performing an SEO audit, you need to check the visibility you have for your most important keywords. There are several parameters to take into account in this analysis, but we mostly focus on how relevant these terms are to the business, on the search intention they respond to, and on the competition.
On the other hand, in this part of the audit, a Keyword Research is also usually done to find opportunities for organic growth. You can use tools such as Ahrefs, SE Ranking or the SEMrush Keyword Magic Tool, which are quite complete and are sure to help you carry out this study quickly and efficiently.
5.3 – Study of the competition
We must not forget that we are not alone on the Internet. Therefore, we must analyze our main competitors and detect the keywords they are positioning.
6. Web Authority and Popularity
At this point, we analyze all the off-page factors of the project we are auditing.
6.1 – Backlinks and authority analysis
In this step, it is essential to verify the health of our backlink profile and make sure there are no harmful links: beware of potentially toxic links that can bring penalties and negatively affect our positioning.
Check here whether your inbound link profile looks as natural as possible and is relevant and related to your topic. Also check that the anchor texts are not over-optimized and fit the context of the content.
Link building is generally a very complex but very important factor within an SEO strategy: it can help you gain more authority for your domain (what we know as DA, or Domain Authority), but it is also one of the factors most likely to attract penalties.
7. Traffic and Conversion
This is the section where we analyze everything related to the visits the website receives and user behavior during those sessions. Among other factors, we look at traffic sources, the URLs visited, dwell time, page views, bounce rate and exit points.
7.1 – Traffic Sources
In this step we must analyze where the visits to our pages come from, that is, the sources that are bringing us that traffic. This data can be found in Google Analytics, under Acquisition > All Traffic > Channels (Direct, Organic Search, Social, Referral, Paid, etc.).
7.2 – Destination of Traffic
At this point, it is necessary to check which URLs are receiving the most traffic, that is, to see the pages that are bringing the most visits to the website.
7.3 – Exit Points
These are the last pages users visited, or the points where they abandoned a shopping cart in an eCommerce. Here we must analyze where users are leaving our site and why.
Two concepts should not be confused: bounce rate and exit rate are not the same. The bounce rate is the percentage of sessions in which the user viewed only one URL, while the exit rate of a page is the percentage of its views that were the last of a session.
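The difference is easier to see with numbers. A minimal sketch over three hypothetical sessions, each represented as a list of pageviews in order:

```python
# A minimal sketch making the bounce rate vs. exit rate distinction
# concrete. The sessions below are hypothetical lists of pageviews.
sessions = [
    ["/"],                         # bounce: a single pageview
    ["/", "/blog/", "/contact/"],  # exits at /contact/
    ["/blog/", "/"],               # exits at /
]

# Bounce rate: share of sessions with exactly one pageview
bounce_rate = 100 * sum(len(s) == 1 for s in sessions) / len(sessions)

# Exit rate of a page: share of its views that ended a session
def exit_rate(page: str) -> float:
    views = sum(s.count(page) for s in sessions)
    exits = sum(s[-1] == page for s in sessions)
    return 100 * exits / views if views else 0.0

print(f"bounce rate: {bounce_rate:.1f}%")
print(f"exit rate of '/': {exit_rate('/'):.1f}%")
```

Here the home page has a high exit rate without all of those sessions being bounces, which is exactly the distinction the paragraph above draws.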
7.4 – Conversion
It is important to analyze the conversion rate of the website or eCommerce and the role of organic traffic in achieving the different business objectives, in order to assess the quality and impact of the visits we receive. This data helps us work on conversion rate optimization (CRO).