In this chapter we will equip you with the skills to diagnose a drop in organic traffic to your site using tools such as Google Search Console and Google Analytics.
Being able to diagnose a drop in traffic is a fundamental skill in SEO, and something worthy of careful dissection. There are several things to consider when undertaking the diagnosis, and doing it properly will make the prognosis and treatment needed to get your website back on track far easier.
Google Search Console
Google Search Console is our preferred tool for diagnosing a drop in traffic. Although third-party tools are useful, no other data source is as accurate and representative as Google’s own.
Moreover, Google Search Console allows you to investigate traffic at a more granular level by looking at metrics such as clicks, impressions, click-through rate, and the average position of your keywords.
The first thing to do if you see a drop in traffic is to establish when it happened.
Next, decide on the window through which to view your data. By this we mean choose a timeframe: you’re better off looking at data from the last 6-12 months rather than the site’s entire history, as the latter may not reflect current performance.
We prefer to set the performance report to the last 12 months. Showing data over this period allows you to take a step back and understand the bigger picture whilst also keeping things fresh and manageable.
You can do this by going to Performance and then Search Results on the left-hand side.

The button to the right of the screen allows you to extract the data for a specific date range.

As in the example below, we can deduce that the ‘drop’ in traffic occurred on or around January 4th 2020 (we will discount the trough seen just before, as this was a temporary dip rather than the start of the longer-term drop).

As an extra precaution, we recommend corroborating the GSC data with Google Analytics to ensure that the information on both services is consistent.

The next step is to isolate the date the drop occurred so that we can compare performance in a given period before and after it.
For the sake of clarity, we recommend looking at traffic three months either side of that date (in our case, 7th January 2020). A three-month window generally provides a representative picture of performance before and after.
Note: If you are looking into the short-term effects of an algorithm update, this window should be shorter (e.g. two weeks either side).
We then export the keyword (query) and page (absolute URL) data that GSC aggregates for us automatically.

Remember to set the rows per page to the maximum, otherwise GSC will only export the default 10 rows shown when you first open the report.

Note: GSC also includes impressions and position in the export, but for a truer representation of a performance drop, only clicks and CTR will be used.
We then add an extra column, Δ, to signify the change in value over the period.
A simple subtraction formula (i.e. subtract the data in column B from column C) returns the difference between the two columns.
=C2-B2
This allows us to sort from lowest to highest to see which keywords saw the biggest drop in clicks for a given page.

It also shows which pages saw the biggest drop.
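If you prefer to calculate the Δ column outside of a spreadsheet, the same comparison can be scripted. Below is a minimal sketch in Python using pandas, assuming two GSC query exports saved as before.csv and after.csv (hypothetical file names) with the standard “Top queries”, “Clicks” and “CTR” columns; adjust the column names to match your own export (for a page export, join on the page column instead).

# Compare two GSC exports and compute the change (Δ) in clicks and CTR.
import pandas as pd

before = pd.read_csv("before.csv")   # e.g. 2019-10-07 to 2020-01-06
after = pd.read_csv("after.csv")     # e.g. 2020-01-07 to 2020-04-06

# GSC usually exports CTR as a percentage string such as "4.5%".
for df in (before, after):
    df["CTR"] = df["CTR"].str.rstrip("%").astype(float)

# Join the two periods on the query and compute the deltas.
merged = before.merge(after, on="Top queries", suffixes=("_before", "_after"))
merged["clicks_delta"] = merged["Clicks_after"] - merged["Clicks_before"]
merged["ctr_delta"] = merged["CTR_after"] - merged["CTR_before"]

# Sort lowest to highest so the biggest losers appear at the top.
print(merged.sort_values("clicks_delta").head(20))

Sorting clicks_delta from lowest to highest surfaces the queries (or, with a page export, the pages) that lost the most clicks.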

We can then use this information to review the specific pages for key issues, such as content that no longer matches user intent, or cannibalisation/duplicate content. A full list of the potential issues we look into is appended below.
Google Penalties
Once you’ve identified when your traffic started to decline, the next step is to identify what caused it. One of the most common causes for a significant drop in traffic is a Google penalty.
Manual Penalties
Rather than a natural shift or drop in rankings, a drop in traffic may be the result of a manual penalty from Google.
Taken from Google’s Search Console Help page:
“Google issues a manual action against a site when a human reviewer at Google has determined that pages on the site are not compliant with Google’s webmaster quality guidelines. Most manual actions address attempts to manipulate our search index. Most issues reported here will result in pages or sites being ranked lower or omitted from search results without any visual indication to the user.”
You can check this on Google Search Console by navigating to Security & Manual Actions and then Manual Actions.

If you do not have a manual action, you will see something like this.

If there is, you’ll see the manual actions issued against the site. You would then need to investigate each one via the accompanying description and submit a review request once the issues that violate Google’s webmaster guidelines have been addressed.
Every manual action notification will include additional information about its cause. Common manual actions include, but are not limited to:
- User-generated spam
- Structured data issue
- Spammy free host
- Unnatural links (to and from your site)
- Thin content with little or no added value
- Cloaking and/or sneaky redirects
- Pure spam
- Duplicate content
Fixing Manual Actions
Google provides detailed instructions on how to go about addressing each kind of manual action in this report.

Below is the general approach for fixing a manual action.
1. Identify which pages have been affected. In the example above, the action pertains to “some pages”.
2. Follow the “Learn more” link to see detailed information and steps to address the respective issue.
3. Be sure to fix the issue on all affected pages. Only fixing some of the pages will not result in the manual penalty being lifted. Likewise, if your website has more than one penalty, it’s important to fix them all.
4. Ensure that the pages that have been affected (and fixed) are accessible by Google. This is because these pages will be reviewed by Google. Test the accessibility of your pages by using the URL Inspection tool.
5. Select Request Review in the same GSC report. This is where you show Google, via a Reconsideration Request, that your website is no longer in violation of Google’s guidelines. A good request should address the following three things:
a. Explain the exact issue on your site.
b. Describe the steps taken to solve the issue.
c. Document the outcome of your efforts.
6. Once submitted, Google will review the request (this can take several days or a week) and update you with their evaluation via email.
Algorithmic Penalties
There are also algorithmic penalties, or filters, built into Google’s main algorithm that can suppress a site in much the same way. These are not reviewed by a human at Google like a manual penalty, and are therefore harder to trace.
Google algorithms such as Panda and Penguin were crackdowns on content and link optimisation techniques that are now considered outdated (thin, low-quality content and manipulative link building respectively).
Indexation Issues
A drop in rankings and traffic may also be a result of indexation issues, such as pages excluded from the index that perhaps should not be.
It’s necessary to check ‘Errors’ in GSC’s Coverage Report, which gives a comprehensive overview of all the pages on the site and how Google is evaluating them for indexation.
Below is an example of how GSC visualises the statuses of your URLs.

Examples of Errors include a submitted page that carries a ‘noindex’ directive, or 404 errors. In short, it is anything that is preventing Google from indexing a page. These errors will likely contribute to pages falling out of the index and are flagged by GSC in red.
In amber are pages that are not outright errors but could be, for example a page that was indexed despite being blocked by robots.txt. This is marked as a warning because Google cannot tell whether you meant to keep the page out of search results.
In grey are the pages excluded from the index by design. Remember, it is normal for not all of the pages on your site to be indexed; your goal is to get the canonical version of every page indexed. Any duplicate or alternate pages will be labelled as excluded, so it is worth checking this section to make sure no canonical versions have been excluded.
In this case, the error is that a submitted URL is marked ‘noindex’. This is to say that the client submitted this page for indexing, but the page has a ‘noindex’ directive either in a meta tag or HTTP header.
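If you want to double-check the affected URLs yourself rather than relying solely on the Coverage report, a short script can fetch each page and look for a noindex directive in either the X-Robots-Tag header or the robots meta tag. This is only a rough sketch: the file name and the regular expression are assumptions, and a crawler such as Screaming Frog will do the same job more thoroughly.

# Rough noindex check: reads URLs from submitted_urls.txt (hypothetical file,
# one URL per line) and reports any page carrying a noindex directive.
import re
import requests

META_NOINDEX = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*noindex', re.IGNORECASE
)

with open("submitted_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    resp = requests.get(url, timeout=10)
    header_hit = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    meta_hit = bool(META_NOINDEX.search(resp.text))
    if header_hit or meta_hit:
        print(f"{url} -> noindex (header={header_hit}, meta={meta_hit})")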

This illustrates a general problem we often see: pages being mistakenly removed from (or added to) the index when we do not want them to be. It is a common way traffic gets lost, and one of the first things worth ruling out.
Once the issue has been resolved, you can validate the fix, flagging to Google that you have addressed the problem. If Google is satisfied, the fix will be marked as Passed, as you can see in the screenshot above.

Google Analytics
Google Analytics is extremely useful for identifying which pages have lost the most traffic, and it complements the keyword movements and performance metrics you see in GSC with a first-hand look at how users behave on your site.
Step 1: Understand the Drop

To understand the drop in traffic, it’s important to think critically in order to fully uncover the cause of the decline.
A good place to start is by asking yourself these questions:
1. Is the drop a sudden crash or a steady decline?
2. How long was it down for?
3. Is it recovering?
As in our example below, we can see that the temporary dip in December 2019 was not fully representative of the overall drop, as traffic recovered for a full month before another, more gradual decline occurred.

Step 2: Zoom Out

Which brings us on to the next point: zooming out.
This step is important because it allows you to take a step back and truly understand the trend relative to what you should expect.
This is particularly important for websites that are subject to heavy seasonality trends.
For example, if you have a website that sells Halloween costumes, then there is an obvious pattern of peaks in the lead-up to October 31st.

Therefore, checking trends data can also help identify the cause of a drop in traffic. In our example above, we can assume that as demand for “halloween costumes” decreases, so does the traffic.
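If you want to check seasonality programmatically rather than eyeballing Google Trends in the browser, the unofficial pytrends library offers a quick way to pull interest-over-time data. A minimal sketch, assuming pytrends is installed (pip install pytrends):

# Pull 12 months of Google Trends interest for a seasonal query, to see
# whether a traffic drop simply mirrors demand.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-GB")
pytrends.build_payload(["halloween costumes"], timeframe="today 12-m")

interest = pytrends.interest_over_time()        # weekly scores on a 0-100 scale
print(interest["halloween costumes"].idxmax())  # peak week - expect late October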
Context is key.
Step 3: Understand Which Pages Have Been Affected

Much like GSC, GA also gives you the option to check which pages have seen the largest drop in traffic.
Navigate to Acquisition > Overview, then after clicking on the Organic Search channel, click Landing Pages to pull up the metrics for specific pages.

You can choose a comparison period at the top right of GA to evaluate the difference in traffic.

Step 4: Isolate Reasons

The last step is perhaps the most technical, but also the most important: isolating the reasons for the traffic drop. This is where you will most likely uncover why the drop happened.
The beauty of Google Analytics is that it facilitates the isolation of certain metrics so you can take a granular look at your customers’ behaviour on your site, as well as who they are and where they’re coming from.
For example, a high bounce rate (the percentage of sessions in which users leave without interacting further after landing on your page) can be an indicator of why traffic has declined for a particular web page.
We can see in the screenshot below that tablet users have a much higher bounce rate than those using mobile and desktop. This would prompt us to look at how we might optimise particular web pages so that they offer a better experience for tablet users.
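A quick way to run this kind of segmentation outside the GA interface is to export the landing page report (with Device Category as a secondary dimension) and analyse it with pandas. The file and column names below are assumptions based on a typical export, so adjust them to match yours.

# Session-weighted bounce rate per device category from a hypothetical GA
# export (ga_landing_pages.csv) with "Landing Page", "Device Category",
# "Sessions" and "Bounce Rate" columns.
import pandas as pd

df = pd.read_csv("ga_landing_pages.csv")
df["Bounce Rate"] = df["Bounce Rate"].str.rstrip("%").astype(float)

by_device = df.groupby("Device Category").apply(
    lambda g: (g["Bounce Rate"] * g["Sessions"]).sum() / g["Sessions"].sum()
)
print(by_device.sort_values(ascending=False))

# Landing pages where tablet users bounce the most.
tablet = df[df["Device Category"] == "tablet"]
print(tablet.sort_values("Bounce Rate", ascending=False).head(10))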

This is but one of myriad examples of how you can use the power of GA to look into a traffic drop and find solutions.
Google Broad Core Updates
Google Broad Core Updates are another factor to consider when looking for potential reasons for a drop in traffic. These are different from algorithms like Penguin and Panda, as the websites and webpages that are impacted “haven’t violated [Google’s] webmaster guidelines nor been subjected to a manual or algorithmic action”.
Instead, broad core updates are designed to improve how Google’s “systems assess content overall”. Essentially, a broad core update works much like refreshing a list of your top 10 favourite movies or albums: nothing is wrong with the entries that move down, the list has simply been reassessed over time.
This means that there is no definitive “fix” for such an update.
As a result, broad core updates should be approached slightly differently: the focus should be on ensuring that the quality of your content is as strong as possible.
The good news is that Google tends to announce such updates.
For example, Google announced its May 2020 Core Algorithm Update on May 4th 2020, which turned out to be one of the most aggressive shifts in the SERPs in a long time.
Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we’ve covered before. Please see this blog post for more about that: https://t.co/e5ZQUAlt0G
— Google SearchLiaison (@searchliaison) May 4, 2020
Therefore, we’d strongly recommend that you follow (or at the very least keep an eye on) the Google Search Liaison Twitter account.
Knowing that webmasters are given a heads up, there are some quick checks that you can make to identify whether your website has been impacted.
Site Search
A simple site search in Google is a good indicator as to whether your website has been affected or not.
Enter the following into Google Search:
site:yourdomain.com

If your homepage does not appear first then there is a high chance that the broad update may have impacted your performance in the SERPs.
Ahrefs
A great tool for quickly checking a drop in traffic is Ahrefs. After entering your domain into the Ahrefs’ Site Explorer tool, navigate to the Organic Search tab. This displays graphs depicting your organic traffic and keyword visibility over time.
If you see a sharp decline, it may be a manual penalty (which you can then confirm in GSC) or a broad core update (which you can identify by checking whether it coincides with Google’s announcement).
For example, we can see below that the organic traffic and keyword visibility for this website plummeted on May 4th 2020, which coincides with when the latest broad core update started to roll out.

SEMrush Sensor
Another great tool that can help identify and confirm whether an update is rolling out is SEMrush’s Sensor.
SEMrush Sensor measures volatility in search results across the web, tracking 20+ categories on mobile and desktop and highlighting when a Google update is likely in progress.
For example, we can see that on May 4th, the Sensor detected significant shifts in the SERPs.

Other Factors That Cause Traffic Drops: A Checklist
Apart from reviewing Google Search Console and Google Analytics, there are additional factors on your website that may be the cause of a drop in traffic.
Low Quality Content
Google ranks web pages according to how well they satisfy user intent. If the content on your web pages does not address or answer the user’s search query, then your page will struggle to rank, as Google will have devalued it for its target keywords.
Poor content may impact your traffic considerably if Google changes the SERPs as a result of a shift in the user’s search intent.
A quick Google search of the terms you want to rank for can help give an idea of what Google is rewarding.
This in turn allows you to improve your content so that it matches the search intent of the top ranking pages for your desired keywords.
Thin Content
Thin content is much the same as poor content but, in many cases, can affect your website’s performance more severely. Pages with thin content do not provide enough information to warrant ranking, and so when they drop, traffic does too.
Thankfully, it is easy to see this by eye, and if you have a lot of indexable pages with thin content that do not drive much traffic, it is better to remove them completely.
A quick way to identify and remove thin content pages is to crawl the site using a tool like Screaming Frog or Sitebulb. These tools allow you to sort pages by word count, to which you can apply an arbitrary threshold (e.g. 500 words) to pick out the pages with thin content.
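As a rough illustration, the sketch below filters a crawl export by word count in pandas. The file name, column headings and 500-word threshold are all assumptions; tune them to your own crawl export and site.

# Flag potential thin-content pages from a crawl export (internal_html.csv,
# hypothetical) containing "Address" and "Word Count" columns.
import pandas as pd

THIN_CONTENT_THRESHOLD = 500  # arbitrary cut-off - adjust for your site

df = pd.read_csv("internal_html.csv")
thin = df[df["Word Count"] < THIN_CONTENT_THRESHOLD]

# Sort the thinnest pages to the top for manual review before removing them.
print(thin.sort_values("Word Count")[["Address", "Word Count"]]
          .to_string(index=False))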
Duplicate Content
You cannot get penalised for duplicate content, and it is not grounds for action unless its intent is to manipulate the search results. Having said that, many sites that have duplicate content will have less desirable pages indexed if Google is left to consolidate the signals itself.
That is why it is important to mark up your content with the correct 301 redirects, canonical tags, parameter handling, and pagination to facilitate easy consolidation of signals to the right pages.
Key causes of duplication are listed below (a quick normalisation sketch follows the list):
- HTTP and HTTPS
- www and non-www
- Parameters and faceted navigation
- Session IDs
- Trailing slashes
- Index pages
- Alternate page versions, such as m., AMP, or print pages
- Dev/hosting environments
- Pagination
- Scrapers
- Country/language versions
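As a rough illustration of how several of these causes show up in practice, the sketch below normalises a list of crawled URLs (scheme, www, trailing slash and query string) and groups them, so that HTTP/HTTPS, www/non-www, parameter and trailing-slash variations of the same page surface as clusters. The input file name is an assumption, and real consolidation still requires redirects or canonical tags.

# Group URL variations that normalise to the same page (Python 3.9+ for
# str.removeprefix). Reads one URL per line from urls.txt (hypothetical).
from collections import defaultdict
from urllib.parse import urlsplit

def normalise(url: str) -> str:
    parts = urlsplit(url.strip())
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    return f"{host}{path}"  # drops scheme, query string and fragment

groups = defaultdict(list)
with open("urls.txt") as f:
    for line in f:
        if line.strip():
            groups[normalise(line)].append(line.strip())

# Any key with more than one variant is a potential duplicate cluster.
for key, variants in groups.items():
    if len(variants) > 1:
        print(key, variants)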
Note that duplicate content can also occur externally, where another website has copied your content. Some sites prefer to set up a DMCA certificate to safeguard their content and have external duplicate content on the web taken down.
Keyword Cannibalisation
Consider the following scenario. You have a web page that is ranking in position 3 and is bringing in 500 unique visitors every month. Suddenly, the traffic drops to just 50 visitors. What you didn’t realise is that a new page you added has now also started to rank for this keyword. This is where keyword cannibalisation can be extremely detrimental to your website’s performance. Earlier in the textbook, we explained how to find and fix keyword cannibalisation issues to prevent this from happening.
Loss of Backlinks
As backlinks are one of the most powerful ranking factors, it is usually quite a pressing issue if some have been lost. This may be because the webmaster who was linking to you has simply taken the link down, or because the linking page now returns a 4xx error.
Either way, the link equity is no longer being transferred to your page, which is likely to have caused a drop in signals and therefore rankings and traffic.
A way to deal with this is to gather all the backlinks pointing towards your page and manually review the status codes of the linking pages first.
You can do this using the List Mode feature of Screaming Frog.
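If you would rather script this check than use a crawler, a short sketch like the one below can report the status code of each linking page. It assumes the referring URLs have been exported to backlink_sources.txt (one per line, hypothetical file name).

# Report the HTTP status of each page that links to you, so 4xx/5xx sources
# stand out for follow-up.
import requests

with open("backlink_sources.txt") as f:
    sources = [line.strip() for line in f if line.strip()]

for source in sources:
    try:
        resp = requests.head(source, allow_redirects=True, timeout=10)
        status = resp.status_code
    except requests.RequestException as exc:
        status = f"request failed ({exc.__class__.__name__})"
    print(f"{status}\t{source}")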

Ahrefs is also an easy way to determine whether you’ve lost backlinks; its reports for lost backlinks and lost referring domains will help you identify exactly why these links disappeared.
For example, if you want to check for lost referring domains, navigate to:
Site Explorer > yourdomain.com > Referring Domains > Lost
This allows you to see which domains are no longer linking to your website.

You can then see whether these links can be recovered by reaching out to the webmasters and asking them to reinstate the link.