Site review for an Irish online paper

SEO review and optimisation suggestions for an Irish online magazine

Summary

Goals

  1. Increase user traffic by growing organic search acquisition.

  2. Improve behavior metrics, including 'Time on Page' and 'Pages / Session'.

  3. Improve the website's overall Google AdSense performance.

Key research takeaways

  1. User engagement level is low.

  2. Content organization / grouping is ineffective.

  3. The number of ranking keywords is relatively low.

  4. Website has technical issues and content duplication problems.

Key suggestions

  1. Disallow crawling/indexing of the alternative domain version (.com vs. .ie).

  2. Stick to a single URL structure and use "rel=canonical" to signal your preferred URL.

  3. Map the different URL paths together to resolve the high rate of 404 (page not found) errors.

  4. Adopt a clear Category -> Sub-category / Tag structure to better organize your content.

  5. Update the 'physical' URL structure to mirror the structure above and give Google a better sense of attribution / connection between content pieces.

  6. Use Breadcrumbs to allow for better user navigation at a post level: Main -> Category / Tag -> Article.

  7. Update your page layout to allow for more effective internal navigation and distribution of PageRank.

Suggestions of further actions

  1. Fix the alternative domain issue

  2. Fix the concurrent URLs issue

  3. Create the robots.txt on the base domain

  4. Configure the .htaccess file to map the 'old' URLs to the 'new' ones

  5. Create a map for content breakdown & attribution: category > sub-categories / tags

  6. Update the URL structure to mirror the map above for newly created content

  7. Produce mockups of new landing pages: category (sub-categories / tags) and post

  8. Develop widgets for supplementary navigation

  9. Update the XML sitemap and re-submit the website to the Google index

Current metrics analysis

Organic traffic (Google) accounts for around 27% of all users, while 47% of all traffic comes from social media (Facebook & Twitter). Interestingly, Facebook mobile users account for almost as much traffic as organic ones and have the potential to outperform organic very soon.

The average user is rather loyal: 45% of daily visits come from returning users who have already visited your pages three times or more. These loyal users are worth dissecting and comparing, as doing so gives a sense of:

  1. Which content performs best?

  2. What is the most popular behavior pattern?

  3. What works best for these users?

Loyal visitors gravitate to TV/Movies and Politics. The same applies to new users, so it is arguably your user segments' shared characteristics that have to be addressed through a more focused content strategy.

Loyal visitors use desktop about as much as mobile devices, while new users prefer mobile. This correlates with the fact that you are able to drive more organic traffic from mobile than from desktop: your pages' organic ranking position is 2 times better on mobile than on desktop (due to relatively effective mobile optimisation) – see below.

In most cases, loyal users come to the site through Social (54%); 24% of them use Organic (Google), mostly typing in the brand name; and 16% come from bookmarks (Direct). By contrast, only 30% of new users come from Social, while 30% come from Organic. This means your social media strategy currently needs to focus on attracting new users to the website rather than only maintaining the community.

Notably, even the loyal users show a low engagement rate with your content: it is hard to move them beyond a single-page visit. The click-through traffic from the landing page is rather low, i.e. 4–6% depending on the post. This comes down to supplementary navigation, i.e. offering other pieces of reading that may interest your users. The most popular routes (sequences of pages) were – see below.

Metrics analysis takeaways

Organic channels for user acquisition are underdeveloped. Organic users mostly search for your brand, your podcasts, and anchor-related terms.

Importantly, no consistent search-term pattern was detected for your website: most search terms you ranked for are either highly time- and event-sensitive (for example, derek davis [death]) or incidental/trivial (for example, sex video or Leicester city sex video).

Social accounts play an important role in maintaining the core of daily traffic, but they are relatively ineffective in both driving new users and developing loyal visitors' engagement with the content.

Your page layout needs quality supplementary content (i.e. related items, 'you might also be interested in', or 'most popular in the category' blocks) to effectively improve engagement metrics.

Site SEO research

The whole picture of your current SEO profile is below. It particularly shows that the website has:

  1. Relatively low number of keywords (around 850).

  2. A vast majority of keywords that are trivial and ranked on the second page of Google (positions 11–20).

  3. Relatively good backlinks profile.

For comparison, http://www.irishmirror.ie/ has 7,000 keywords and 275k organic visits vs. your 12k.

The core problem lies in ineffective content organization. What you have to do, then, is improve content organization and internal linking by updating the whole page template.

Now, the work suggested has to secure:

1. A clear attribution of each content piece to a category > sub-category (tag) through:

  1. the URL structure itself, i.e. 'main/category/subcategory/human-readable-news-title', for example like this: http://www.irishmirror.ie/news/world-news/schoolboy-left-fighting-life-after-5833522

  2. the breadcrumbs navigation – see below: News > Weird News > Religion

  3. the use of tags to group like content

2. Effective internal navigation through:

  1. Main-menu supplementary navigation to the most trending topics

  2. Sidebar navigation to the hottest news / most promoted content

  3. 'Related articles' navigation that showcases similar articles in the [same] category

  4. Footer navigation that presents the main content categories in an expanded format so crawlers access them more often

It's important because it:

  1. creates citations of potential keywords on a greater number of pages

  2. allows for a better distribution of PageRank, i.e. improves each individual page's chances of ranking

  3. creates a better user experience
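The breadcrumbs navigation mentioned above can also be exposed to Google as structured data, reinforcing the category attribution for crawlers. Below is a minimal sketch using the schema.org BreadcrumbList vocabulary; the News > Weird News > Religion trail is borrowed from the example, and the newstalk.com category URLs are hypothetical:

```html
<!-- JSON-LD breadcrumb markup, placed anywhere in the article page's HTML.
     The category URLs below are illustrative, not real site paths. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "News",
      "item": "http://www.newstalk.com/news/" },
    { "@type": "ListItem", "position": 2, "name": "Weird News",
      "item": "http://www.newstalk.com/news/weird-news/" },
    { "@type": "ListItem", "position": 3, "name": "Religion",
      "item": "http://www.newstalk.com/news/weird-news/religion/" }
  ]
}
</script>
```

This mirrors the visible breadcrumb trail, so it should only be added once the Category -> Sub-category structure itself is in place.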

For example, consider the 'Derek Davis' search term. Google normally gives precedence in ranking to websites that have more content on the search term (deemed more relevant) and that render higher-quality content (expertise, supplementary content, number of backlinks, etc.).

Now your website has around 230 pages featuring that term: https://www.google.ie/webhp?hl=ru#q=site:www.newstalk.com+derek+davis whereas http://www.irishexaminer.com/ has 15 times more pages: https://www.google.ie/#q=site:www.irishexaminer.com+derek+davis.

What's important to understand is that, through proper content organization and navigation, http://www.irishexaminer.com/ would win the battle even with a lower number of original pieces of content – it simply makes better use of the supplementary-navigation machinery.

In addition to content organization and navigation (internal linking), your website has some issues with the technical side of SEO, i.e. crawlability, missing content, and duplicate content. These are of core importance to overall organic ranking ability.

Crawlability and indexing

Crawlability and indexing refers to compliance with Google's technical requirements: Google must be able to access, crawl, and index your pages, and the content must not be duplicated.

The website is accessible under two different domains: https://newstalk.ie/ vs. http://www.newstalk.com/. This causes content duplication issues that make it harder for the Google algorithm to figure out which page to show in search results. Normally, large-scale duplication undermines the credibility of your pages and results in poor indexing of your content.

Suggestion: pick one domain version and disallow ALL other versions from crawling/indexing by Googlebot.

Here's what you need to do for the https://newstalk.ie domain:

  1. Create a robots.txt file in your document root.

  2. Add the following two directives to it (they block all crawlers from the entire domain):

     User-agent: *
     Disallow: /

  3. Submit the robots.txt via Google Webmaster Tools.

  4. Remove the domain from your webmaster console.
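Assembled, the robots.txt for https://newstalk.ie is a two-line file; note that '#' starts a comment in robots.txt:

```
# robots.txt for https://newstalk.ie (the non-preferred domain).
# Do NOT deploy this file on www.newstalk.com, or the preferred
# domain would be blocked as well.
User-agent: *   # the rule below applies to all crawlers
Disallow: /     # disallow every path on this domain
```

Because the file sits only on the .ie domain, the preferred .com domain remains fully crawlable.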

Another issue is the preferred URL structure. On your 'base domain', http://www.newstalk.com/, there are two kinds of URL structures in place: http://www.newstalk.com/reader/47.301/44495/0/ vs. http://www.newstalk.com/Kildare-Newbridge-houses-fire-court-Millfield-Manor-man-arrested. You have already put a redirect in place for browser users; yet Googlebot can still see both URLs, as it does not use a browser to access your pages. This causes another round of duplication issues: you have more than 10,000 duplicate page titles and, obviously, duplicate page content.

I recommend using the rel="canonical" link element to show Google the preferred version of your page (the one you want to appear in search results). This can be done by installing ready-made extensions (plugins) for your CMS or by editing your templates 'manually'.
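As a sketch, the canonical declaration is a single link element placed in the page's head, pointing from the '/reader/ based' duplicate to the human-readable URL (using the example article quoted above):

```html
<!-- Placed in the <head> of the '/reader/ based' duplicate page,
     e.g. http://www.newstalk.com/reader/47.301/44495/0/ -->
<head>
  <!-- ...other head elements... -->
  <link rel="canonical"
        href="http://www.newstalk.com/Kildare-Newbridge-houses-fire-court-Millfield-Manor-man-arrested" />
</head>
```

The same tag can also be placed (self-referencing) on the human-readable page itself, which is a common safeguard against other accidental URL variants.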

Server connectivity and missing page (404) errors

Google Webmaster Tools shows that Googlebot has recently experienced problems accessing your web pages on the server. Apparently, your script cannot process search queries of the form 'search.php?search_term=ireland&currentpage=1', i.e. the on-site search is broken. Suggestion: fix it or remove search altogether.

You have a relatively high number of 404 errors, i.e. pages that have been removed and are no longer accessible from their old URL addresses. With many 404 errors, your web pages' PageRank (PR) cannot circulate normally. PR is an index historically used by Google to determine the relative probability of a page's ranking in organic search. It is said to be gained by counting in the 'quality' of incoming links, and it is distributed through internal linking across different pages. So if a page returns a 404 error, the PR does not flow to the related pages within the same category but, at best, is channelled to the 'default' pages (Home).

The 404 errors originate from the two concurrent URL structures, i.e. '/reader/ based' vs. 'human-readable' URLs. Apparently, these are produced by your content management system (CMS).

So, the first thing to do is stop producing alternative URL paths to the same content. Alternatively, specify a canonical URL for every '/reader/ based' page. I recommend that Google only index the 'human-readable' URLs – those will be the canonical ones.

Afterwards, you will need to map the '/reader/ based' URLs to the 'human-readable' URLs. From my observation, most 404 errors follow the same pattern: inaccurate mapping between the concurrent URLs. For example, http://www.newstalk.com/Passenger-removed-from-plane-over-notebook-doodles returns a 404 error, while http://www.newstalk.com/Passenger-removed-from-plane-over-notebook-doodles- (with a trailing hyphen) redirects the user to a '/reader/ based' URL (no 404 error).
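In .htaccess terms, each mapping can be expressed as a permanent (301) redirect from the '/reader/ based' path to its human-readable equivalent. The rule below is a hypothetical sketch, assuming Apache with mod_rewrite enabled, and uses the example URL pair quoted earlier; the real ID-to-slug pairs would have to be generated from your CMS database:

```apacheconf
RewriteEngine On

# Hypothetical mapping of one '/reader/ based' URL to its human-readable slug.
# R=301 tells Google the move is permanent; L stops further rule processing.
RewriteRule ^reader/47\.301/44495/0/?$ /Kildare-Newbridge-houses-fire-court-Millfield-Manor-man-arrested [R=301,L]
```

For thousands of articles, one rule per article quickly becomes unmanageable; Apache's RewriteMap (defined in the server configuration rather than in .htaccess) backed by a lookup file is the usual approach at that scale.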

Suggestions of further actions (repeated):

  1. Fix the alternative domain issue

  2. Fix the concurrent URLs issue

  3. Create the robots.txt on the base domain

  4. Configure the .htaccess file to map the 'old' URLs to the 'new' ones

  5. Create a map for content breakdown & attribution: category > sub-categories / tags

  6. Update the URL structure to mirror the map above for newly created content

  7. Produce mockups of new landing pages: category (sub-categories / tags) and post

  8. Develop widgets for supplementary navigation

  9. Update the XML sitemap and re-submit the website to the Google index
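For step 9, the XML sitemap follows the standard sitemaps.org protocol. A minimal sketch listing one canonical, human-readable URL (the example article quoted earlier in this review):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical, human-readable URL;
       '/reader/ based' duplicates should not appear here -->
  <url>
    <loc>http://www.newstalk.com/Kildare-Newbridge-houses-fire-court-Millfield-Manor-man-arrested</loc>
  </url>
</urlset>
```

The file is typically served as /sitemap.xml on the preferred domain and re-submitted via Google Webmaster Tools after the URL clean-up.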
