1,277% Organic Traffic Growth: Technical SEO & User-Focused Case Study

Learn how we increased the monthly organic traffic of this online platform in the transportation sector by 1,277%, from 936 to 12,894 visitors, in just over 12 months.

Disclaimer: As a white label SEO agency, we keep the names of the websites we work on confidential to respect our partners.

Objective:

The primary goal for this campaign was to improve the accessibility of the website for search engines and elevate the experience for users.

Website History:

An online platform in the transportation sector based in the U.S.A.

The Main Issues Holding the Site Back:

This case study details the strategies we implemented to address the following issues and achieve significant improvements by focusing on technical SEO and UX optimisation:

  • Conflicting Indexing Directives: Inconsistencies between robots.txt directives and meta robots tags for blog posts created confusion for search engine crawlers, hindering proper indexing. In addition, a frequently crawled subdomain hosting an API consumed a significant share of the crawl budget, limiting what was left for essential sections of the website.
  • Duplicate Content Issues: Multiple subpages with identical templates and very similar content potentially confused search engines about the primary content.
  • User Experience Optimisation: Clear categories were missing from the blog, and navigation needed to be strengthened with the implementation of breadcrumbs.
  • User-Friendly Listing Presentation: Individual listing pages needed to be optimised to showcase useful information in a user-friendly way for both search engines and potential customers.

The Strategy

Implementing Clear & Consistent Indexing Signals

One of the key technical SEO requirements is to make sure that your content is easily accessible for search engine crawlers.

The client’s robots.txt directives needed to be revised to reflect their indexing preferences accurately. For example, a subdomain on the site containing hundreds of API-related URLs was excluded from crawling, freeing up the crawl budget for more important sections of the website.

A robots.txt file is a text document on a website’s server that tells search engine bots which pages they may crawl. It helps control crawler access by specifying which areas of the site should not be crawled.

Here’s an example of a robots.txt file for an eCommerce website:

User-agent: *
Disallow: /checkout/
Disallow: /account/
Disallow: /cart/
Disallow: /wishlist/
Disallow: /search/

In this example:

  • “User-agent: *” specifies that the rules apply to all web crawlers.
  • “Disallow: /checkout/” prevents crawlers from accessing the checkout pages.
  • “Disallow: /account/” blocks crawlers from accessing user account pages.
  • “Disallow: /cart/” prevents crawlers from accessing the shopping cart pages.
  • “Disallow: /wishlist/” blocks crawlers from accessing the wishlist pages.
  • “Disallow: /search/” instructs crawlers not to crawl the search pages.
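You can sanity-check rules like these with Python’s built-in urllib.robotparser before deploying them. This is a minimal sketch using the example directives above; the URLs are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Parse the example rules directly (on a live site you would
# call rp.set_url("https://www.example.com/robots.txt") and rp.read())
rp = RobotFileParser()
rp.parse("""
User-agent: *
Disallow: /checkout/
Disallow: /account/
Disallow: /cart/
Disallow: /wishlist/
Disallow: /search/
""".splitlines())

# Disallowed sections return False for any user agent
print(rp.can_fetch("*", "https://www.example.com/checkout/"))      # False
print(rp.can_fetch("*", "https://www.example.com/products/tyres")) # True
```

Running a quick check like this catches typos in paths before they silently block (or expose) the wrong sections of a site.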

Use our free robots.txt generator tool, which will automatically create one for you, and learn more about the best practices here.

Implementing Canonical Tags

Canonical tags are HTML elements used to indicate the preferred version of a web page when duplicate or similar content exists across multiple URLs. Such tags were missing from the client’s website, which had numerous pages with similar, templated content.

Implementing canonical tags is important for SEO as they:

  • Prevent Duplicate Content: Canonical tags consolidate duplicate or similar content under a single preferred URL, preventing search engines from indexing multiple versions of the same content.
  • Consolidate Link Equity: They concentrate inbound links to a single URL, enhancing its authority and improving search rankings.
  • Improve Crawl Efficiency: Canonicalisation streamlines the crawling process by guiding search engine bots to the primary version of a page, enhancing crawl efficiency.
  • Enhance User Experience: By consolidating similar content, canonical tags improve user experience by ensuring visitors are directed to the most relevant and authoritative version of a page.

Here are some best practices to follow when implementing canonicals:

  • Use Self-Referencing Canonicals: Each page should include a canonical tag pointing to its own URL. 
  • Choose the Preferred URL: Select the most authoritative and representative URL as the canonical version to consolidate link equity and avoid indexing variations.
  • Use Absolute URLs: Specify canonical URLs using absolute paths rather than relative paths to ensure accuracy and prevent potential canonicalisation errors.

For example, use <link rel="canonical" href="https://www.example.com/page" /> instead of <link rel="canonical" href="/page" />.

  • Avoid Canonical Chains: Prevent creating a chain of canonicals by ensuring each canonical tag points directly to the preferred URL rather than through intermediary pages.
  • Handle Pagination Carefully: Give each page in a paginated series a self-referencing canonical tag rather than pointing every page at the first one, so deeper pages stay crawlable while parameter-based duplicates are still consolidated.
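The "absolute URLs" practice above can be enforced programmatically. This is a minimal sketch, assuming a hypothetical example.com site, that resolves a relative canonical href against the page URL using Python's standard library:

```python
from urllib.parse import urljoin

def absolute_canonical(page_url: str, canonical_href: str) -> str:
    """Resolve a canonical href against the page URL so the emitted
    tag always carries an absolute URL (a best practice above)."""
    return urljoin(page_url, canonical_href)

# A relative canonical resolves to the full preferred URL,
# dropping tracking parameters from the page URL in the process
print(absolute_canonical("https://www.example.com/page?ref=123", "/page"))
# https://www.example.com/page
```

A helper like this in the templating layer means a relative path can never slip into the rendered `<link rel="canonical">` tag.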

Improving User Navigation With Breadcrumbs

As a platform with hundreds of location-based pages for states and cities within the U.S.A., it was important to ensure that users were able to quickly and easily navigate around the website. We implemented breadcrumb navigation to help address this.

Breadcrumb navigation (also known as breadcrumbs) displays a hierarchical trail of links, typically near the top of a webpage, helping users track their path from the homepage to the current page. For example:

Home » Texas » Austin

By implementing breadcrumbs, you: 

  • Enhance Navigation: Provide users with a clear path back to previous pages or sections, making it easier to navigate through the website.
  • Improve User Experience: By offering a visual representation of the site’s structure, breadcrumbs help users understand their location within the website, reducing confusion and enhancing overall user experience.
  • Reduce Bounce Rate: Clear navigation aids like breadcrumbs can reduce bounce rates by helping users explore more pages and stay engaged with the website.

Breadcrumbs can also contribute to SEO by providing additional internal linking opportunities, which can improve the discoverability and indexing of pages by Google.
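To help Google display breadcrumbs in search results, the visible trail is usually paired with schema.org BreadcrumbList structured data. This is a minimal sketch that generates the JSON-LD for a trail; the location URLs are hypothetical:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD from a list of
    (name, url) pairs ordered from homepage to current page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {
                "@type": "ListItem",
                "position": i,   # 1-based position in the trail
                "name": name,
                "item": url,
            }
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([
    ("Home", "https://www.example.com/"),
    ("Texas", "https://www.example.com/texas/"),
    ("Austin", "https://www.example.com/texas/austin/"),
]))
```

The output goes into a `<script type="application/ld+json">` tag in the page head alongside the visible breadcrumb links.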

User & Search Engine-Friendly Content Optimisation

With Google’s emphasis on providing helpful content, it was vital that the individual listing pages on the site provided value and addressed what users were searching for based on their location. At the same time, key on-page elements such as the page titles and headings lacked optimisation for target keywords.

When optimising content, we focused on:

  • User Intent: We identified questions and concerns that users may have for each location listing page and provided clear, easy-to-understand answers. By aligning the content with the user in mind, we were able to provide a more valuable and smoother experience.
  • Heading Structure: We broke up key pieces of information with a logical heading structure (i.e. using H2s and H3s for subheadings) to make it easier for users to find what they’re looking for and to help search engines understand the structure of the content.
  • Internal Links: We added internal links to other relevant pages on the website, specifically to nearby cities and states that may be useful for users. Doing so helps keep visitors on the site for longer and encourages search engines to crawl pages by following these internal links.
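A logical heading structure, as described above, can be audited automatically: a page should never jump more than one level deeper at a time (e.g. straight from an H2 to an H4). This is a minimal sketch using Python's standard-library HTML parser; the sample page is hypothetical:

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Collect heading levels (1-6) in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Match h1..h6 only (avoids e.g. <hr>)
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self.levels.append(int(tag[1]))

def heading_levels(html: str):
    parser = HeadingCollector()
    parser.feed(html)
    return parser.levels

def no_skipped_levels(levels):
    """True if no heading jumps more than one level deeper than the last."""
    return all(b - a <= 1 for a, b in zip(levels, levels[1:]))

page = "<h1>Austin Listings</h1><h2>Pricing</h2><h3>Fees</h3><h2>FAQ</h2>"
print(heading_levels(page))                      # [1, 2, 3, 2]
print(no_skipped_levels(heading_levels(page)))   # True
```

Moving back up the hierarchy (H3 to H2) is fine; the check only flags downward jumps that skip a level.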

The Results

Since the start of the campaign in April 2023, the site’s monthly organic traffic increased from 936 to 12,894 sessions: an increase of 1,277%.

increase in monthly organic traffic

The site now ranks for an all-time high of 624 keywords within the top 10 positions of Google.

top 10 keywords increase

And ranks for an all-time high of 4,082 keywords overall.

overall keywords increase

Launch your organic search campaign

Drive your business forward through organic search. Our high-growth technical SEO strategies can make it happen.
