SEO Audit: a large Travel offers aggregator

This is an excerpt from an SEO Audit & Recommendations Report.

The structure of the report may differ from the original. This content is featured under a license from the respective rights owner FOR ILLUSTRATIVE PURPOSES only. All names have been changed to protect the involved parties' privacy.


Redirects implemented ineffectively. The analysis of 5 old domains shows that they share the same pattern: due to drawbacks in the 301 redirect implementation, on average two-thirds of the pages were not crawled at the new site, which results in 1) less PageRank passed to your new site, 2) less relevant content on the new site, and 3) a drop in user experience and technical compliance of the new site. This should be cured by updating the old domains’ sitemaps and specifically pinging Google about these updates.

Backlink pattern unnatural; relatively low-quality links. A disproportionate number of links (301 redirects are considered links too) come from related sites that even share the same IP. I recommend gradually removing links from your network sites and replacing them with quality editorial ones. I also recommend reducing the number of redirects by removing 301 redirects from the main pages of the old domains (after the sites’ sitemaps are successfully updated). Low-quality external links should also be removed by contacting the respective site owners.

Internal links ineffective. An ineffective internal link structure prevents PageRank from flowing freely even to the most important top-level pages. As a result, many destination-related pages may have received a lower PR than they otherwise would. I recommend featuring travel destination links on every page of the site.

Sitemap missing. In line with updating the old domains’ sitemaps, I recommend creating a sitemap for the new site so that Google can crawl it most effectively.

301 (permanent) redirect

Please see a 301 redirect operation case study.
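A page-to-page 301 redirect is typically implemented at the web-server level. As a minimal sketch of the recommended approach (all domain names and paths below are hypothetical placeholders, not taken from the audited site), the following generates Apache `Redirect 301` directives from an old-to-new URL mapping:

```python
# Sketch: generate Apache "Redirect 301" directives for a page-to-page
# migration. All URLs here are hypothetical placeholders.

def redirect_directives(mapping):
    """Turn {old_path: new_url} into Apache mod_alias directives.

    A page-level mapping (every old URL pointing at its closest new
    equivalent) passes more PageRank than redirecting everything to
    the new homepage.
    """
    lines = []
    for old_path, new_url in sorted(mapping.items()):
        lines.append(f"Redirect 301 {old_path} {new_url}")
    return "\n".join(lines)

if __name__ == "__main__":
    mapping = {
        "/cape-town/hotels": "https://new-site.example/destinations/cape-town/",
        "/kruger/lodges": "https://new-site.example/destinations/kruger/",
    }
    print(redirect_directives(mapping))
```

The key design point is granularity: a blanket redirect of every old URL to the new homepage is exactly the drawback described above, whereas a per-page mapping preserves both relevance and PageRank flow.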

Inbound linking is perhaps the most well-known and discussed of the link structure elements. The possible factors that Google uses to determine the value of an inbound link are:

  • Broad use of keywords in link anchor texts. Understandably, many website owners don't control the text that other sites use to link to them. In the case of your site, it never hurts to request a revision of an inbound link's anchor text. Websites can suggest the most appropriate way to link back. Never require specific text, but provide options for the creatively challenged.

  • Relevant sites link to relevant pages. A relevant website linking to the most relevant category pages offers the highest value. A link from a page with little or no relevance may harm or dilute the link value or the website overall.

  • Natural link acquisition. Natural links to your site develop as part of the dynamic nature of the web, when other sites find your content valuable and think it would be helpful for their visitors.

  • Ethical relations between sites. Choosing to accept links from other ethical websites will have a lasting effect on your site's rankings. Choosing sites that deliberately try to boost link popularity through link farms or other schemes to fool the search engines may lead to a drop in rankings or possible search engine penalties.

Obviously, keyword usage in link anchors signifies the semantic relevance of the linked page, i.e. it says: hey, there is interesting content featuring specific words on the linked page. Thematic proximity of the linking page is a good sign that a page belongs to a certain theme and may be relevant to it. A sound link acquisition pattern shows that a page’s popularity is intrinsically organic, i.e. acquired through the natural process of content marketing, which means it was liked by site owners of various tastes. And ethical link building means that you subscribe to Google’s rules and won’t try to manipulate PR.

Following from the last point, there is evidence that a disproportionately high number of links has come from obviously related sites (same IP). So, to prevent issues of unethical relations between sites, I highly recommend gradually removing links from related sites while substituting them with independent editorial links.

Importantly, the overall quality of the site’s incoming links has to be enhanced. According to Moz, the number of linking root domains to your site is around 100. Google Webmaster Tools says that historically there were 893 domains with links to pages on your site. While it’s common for Google Webmaster Tools to keep all historical data, it is obvious that the majority of those linking root domains only supplied you with temporary links.

So, there is also an issue with relatively low-quality directory or bookmark site links. As Google's guidelines state, creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of those guidelines. Therefore, it’s a good idea to contact the sites’ webmasters and ask them to remove those backlinks.

To most effectively notify Google that the content on external sites has been updated (i.e. links removed), we will need to ping it. There is a large variety of services on the web that notify search engines that a certain page has been updated. Among the free ones, the fastest is probably
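For the sitemap updates themselves, Google has historically accepted a plain HTTP GET ping. A minimal sketch of building that ping URL (the sitemap address is a hypothetical placeholder, and the actual HTTP request is omitted so the snippet stays offline-safe; note that Google has since deprecated this endpoint):

```python
# Sketch: build the Google sitemap ping URL for an updated sitemap.
# The sitemap address below is a placeholder, not the client's real URL.
from urllib.parse import quote

def google_ping_url(sitemap_url):
    """Return the GET URL that notifies Google of a sitemap update."""
    return "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

if __name__ == "__main__":
    # Issuing a GET request to this URL performs the ping.
    print(google_ping_url("https://old-domain.example/sitemap.xml"))
```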

Importantly, there is now a rather clear backlink pattern for your site. Currently, the linking root domains broadly fall into the following categories:

  1. Wiki-like pages, for example

  2. Travel directory pages, for example

  3. Accommodation suppliers pages, for example

  4. Your network sites, for example,

  5. Various sites where there are spam comments, for example,

  6. Sites with paid or spammy links, for example,

Note that some links lead to 404 errors (2,088 pages not found in total).

Overall, you obviously lack backlinks from quality, high-PR sites, apart from your own site network. Accommodation supplier pages, the main source of your links, have a low PR on average. Moreover, they dominate your backlink profile, which seriously impacts your pages’ expertise as perceived by Google.

Internal links are important signifiers of the weight a site owner attaches to a page. A clear internal link structure also ensures that PageRank can flow freely from high-level pages (the most linked) to lower-level ones, which maintains the site’s overall search engine performance. So, the more internal links a page has, the higher its position in the SERP will predictably be.

A typical page of your site, for example, has 72 internal links. These are: Other Places menu entries, individual listings on the page, other pages within the category (i.e. p=2, p=3, etc.), as well as Quick Links entries (bottom menu).
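Internal-link counts like these can be reproduced with a simple crawler pass. A minimal sketch using Python's standard library (the host name and markup below are illustrative placeholders) counts `<a href>` targets that stay on the same host:

```python
# Sketch: count internal links on a page. The HTML and host below are
# illustrative placeholders, not the audited site's markup.
from html.parser import HTMLParser
from urllib.parse import urlparse

class InternalLinkCounter(HTMLParser):
    def __init__(self, host):
        super().__init__()
        self.host = host
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        netloc = urlparse(href).netloc
        # Relative links and same-host absolute links are internal.
        if netloc in ("", self.host):
            self.count += 1

page = """
<a href="/destinations/cape-town/">Cape Town</a>
<a href="https://example-travel.example/destinations/kruger/">Kruger</a>
<a href="https://other-site.example/">External</a>
"""

counter = InternalLinkCounter("example-travel.example")
counter.feed(page)
print(counter.count)  # 2 of the 3 links are internal
```

Running such a pass over every template (category page, listing page, footer) gives the per-page totals that the comparison below relies on.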

Compare these results to the top-ranking competitor below (TOP 1 for the search query). Its Cape Town Accommodations page has 352 internal links.

A screenshot was featured here.

Please also see your Google Webmaster Tools account for the relative number of internal links each page gets (picture below).

A screenshot was featured here.

Apparently, the Cape Town page gets a relatively high amount of internal link ‘juice’, which influences its position in Google search results. Consider the Kruger National Park page instead: its internal linking is clearly insufficient.

Usually, I observe a strong correlation between the number of internal links (see above), the number of external links (see below), and a page’s search engine ranking position. So, to enhance your SE ranking, I recommend giving more internal links to your travel destination pages, as these are your most important landing pages.

To that end, my advice is to feature your travel destination links in a footer menu (see below).

A screenshot was featured here.

Sitemap

Google Webmaster Tools detected no sitemap on your site.

A sitemap is a helpful tool that can improve the crawling of your site, particularly if your site meets one of the following criteria:

  • Your site is really large. As a result, it’s more likely that Google’s web crawlers might overlook some of your new or recently updated pages.

  • Your site has a large archive of content pages that are isolated or not well linked to each other. If your site’s pages do not naturally reference each other, you can list them in a sitemap to ensure that Google does not overlook some of them.

Google currently has some 19,200 pages of your site indexed. As you update your other network sites’ sitemaps and ensure proper page-to-page redirects, you will definitely need a sitemap for Google to keep track of everything.

Also, your sitemap can provide valuable metadata about the pages listed in it. Metadata is information about a webpage, such as when the page was last updated, how often the page changes, and the importance of the page relative to other URLs on the site.
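Generating such a sitemap is straightforward. A minimal sketch (the URL and date are hypothetical placeholders) emits the standard sitemaps.org XML with exactly the metadata fields mentioned above:

```python
# Sketch: build a minimal sitemaps.org XML file. The URLs and dates
# below are placeholders, not the client's real pages.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(entries):
    """entries: list of (loc, lastmod, changefreq, priority) tuples."""
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod, changefreq, priority in entries:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = lastmod          # last update date
        SubElement(url, "changefreq").text = changefreq    # how often it changes
        SubElement(url, "priority").text = priority        # relative importance
    return tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("https://new-site.example/destinations/cape-town/",
     "2015-06-01", "weekly", "0.8"),
])
print(xml)
```

The resulting file is saved at the site root (conventionally `/sitemap.xml`) and submitted through Google Webmaster Tools.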

Search Appearance – Structured data

Structured data such as review ratings, breadcrumbs, or sitelinks (see below) does not impact your SERP position, but it significantly impacts the click-through rate (the ratio of clicks to impressions on your page). I recommend implementing structured data on the site.

A screenshot was featured here.
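As an illustration of the review-rating case, such markup is typically embedded as JSON-LD in the page head. A minimal sketch (the business name and numbers are hypothetical placeholders, not the client's data) builds the snippet:

```python
# Sketch: emit a JSON-LD AggregateRating block for a listing page.
# The name and numbers are illustrative placeholders.
import json

snippet = {
    "@context": "https://schema.org",
    "@type": "Hotel",
    "name": "Example Guest Lodge",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
}

# The dumped string goes inside <script type="application/ld+json">…</script>
# in the page's <head>.
print(json.dumps(snippet, indent=2))
```

Breadcrumbs follow the same pattern with the schema.org `BreadcrumbList` type, and Google Webmaster Tools reports any markup errors under Search Appearance – Structured data.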

Last updated