Hidden Road Blocks
Ever wondered why your website doesn’t attract any visits, or at least not as many as you expect? You might be amazed at how many hidden pitfalls could be lurking in your website’s code, ready to put the kibosh on your plans to dominate the world.
This is a post for small and medium-sized businesses, highlighting some of the risks of not having an SEO audit carried out on your website, domain, backlinks and server setup.
I’ve encountered 3 road blocks recently which turned out to be very easy to circumvent. However, had they not been spotted in the first place, they would probably have permanently impaired some aspects of each website’s performance.
SEO can be perceived by people outside the industry as being about little more than keywords and backlinks. But in reality the breadth and scope of the subject are mind-boggling, which is why it has such enormous potential and also why it’s so much fun.
Example 1: Crawler Access Inadvertently Denied By Robots File
When we reviewed the Bytes website recently, we found that the robots.txt file was preventing spiders from crawling one whole section of the site. This was nobody’s fault in particular; it arose from an unlucky coincidence between the name chosen for a subdirectory within the site and an existing directive in the robots file.
This meant that, as far as the search engines were concerned, that subdirectory, and all the pages and content within it, simply did not exist, because the crawlers (aka spiders or robots) had been prevented from accessing it.
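To illustrate the kind of clash involved (the directive and directory names below are hypothetical stand-ins, as the actual ones at bytes.co.uk aren’t shown here), bear in mind that Disallow rules in robots.txt are prefix matches: a rule written for one path silently blocks every path that begins with the same characters.

    User-agent: *
    # Intended only to keep crawlers out of /private/, but because
    # Disallow is a prefix match it also blocks /private-client-offers/
    # and every page inside it.
    Disallow: /private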
Simply renaming the subdirectory (and of course setting up 301 redirects from the old addresses to the new ones) cured the problem instantly. Within a day or two, pages like these were appearing on page 1 of Google (google.co.uk) for obvious search terms.
http://www.bytes.co.uk/info/licensing-updates/microsoft-price-increases-from-1st-july-2012/
http://www.bytes.co.uk/info/licensing-updates/microsoft-price-changes/
The content on those pages is time-sensitive, so sorting this out brought benefits, in the form of hundreds of visitors, that would otherwise have been lost.
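As an aside, the “rename plus 301 redirects” fix described above can usually be handled with one blanket rule. Here is a minimal sketch for an Apache .htaccess file using mod_alias, continuing the hypothetical directory names from the robots.txt sketch (the real names aren’t given in this post):

    # Permanently redirect every URL under the renamed directory
    # to its counterpart under the new, unblocked name.
    RedirectMatch 301 ^/private-client-offers/(.*)$ /client-offers/$1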
Note:
If you or your customers hold an inventory of Microsoft licences, be aware that Microsoft confirmed last week that volume licensing prices will rise from 1 July 2012 by between 1.7 per cent and 25.9 per cent for the UK private sector.
If you’ve been caught napping, 25 per cent is not to be sniffed at, so don’t delay any further if you don’t want to miss the deadline. Talk to the UK’s leading Microsoft licensing experts as soon as you can.
Example 2: Crawler Access Blocked By JavaScript Menu
In this example, the main navigation menu for The Wholesale Glass Company website was built using JavaScript, and some of the code sat in a subfolder blocked by the robots.txt file.
However, there was an HTML sitemap which was crawlable and therefore helped to ensure that all the pages and the content on them were indexed, so what’s the big deal?
The problem was that each of the most important pages on the site, the product pages like this one about acoustic glass, had only one crawlable internal link pointing to it: the link from the sitemap. Meanwhile, some less important pages had links from every other page on the site via the header. The number of internal links pointing at a page is one signal Google uses to gauge the importance of that page relative to other pages on the site, so the non-crawlable menu links were causing the most important pages to rank less well than they should have done.
Replacing the code for the main navigation menu was a simple matter, and rankings for the product pages improved in less than a week, typically by 20 to 40 places.
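For illustration, the before and after might have looked something like this; the markup, folder and URLs here are hypothetical sketches, not the site’s actual code:

    <!-- Before: the menu was assembled by a script sitting in a
         robots.txt-blocked subfolder, so crawlers never saw its links -->
    <script type="text/javascript" src="/blocked-folder/menu.js"></script>

    <!-- After: plain HTML anchors that any crawler can follow -->
    <ul id="main-nav">
      <li><a href="/acoustic-glass/">Acoustic Glass</a></li>
      <li><a href="/toughened-glass/">Toughened Glass</a></li>
    </ul>

The key point is that links rendered only by blocked or otherwise uncrawlable JavaScript pass no internal link signals, whereas ordinary anchors do.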
Example 3: 301 Redirect Set Up Incorrectly
This one is embarrassing but I like to be open!
When carrying out a few experiments (see my earlier post on Google’s Venice update), I deleted the original page and replaced it with a new page with largely modified content here:
http://www.adjuice.co.uk/camberley-seo/
Following the Venice update, the original page had started ranking at number 5 on searches for “SEO” and “SEO services” for a searcher in this locality. I wanted to try to determine what factors might be at play, and wondered whether deleting the page would result in the home page ranking in the same position the deleted page had held. After all, the site has the Camberley address on every page and on many third-party sources like our Google Place Page, so why shouldn’t another page still be highly relevant?
Weirdly enough, the home page did, but only if I set my location to one of the neighbouring towns, for example Guildford. If I set my location to Camberley, the home page did not appear. I still haven’t worked that one out.
The next step was to create the new page, highly optimised for Camberley. This didn’t work; that is, until I discovered, on checking 2 days ago, that I had not correctly set up the 301 redirect from the old, deleted page to the new one! 🙁
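For completeness, the missing rule was a one-liner. A sketch in Apache .htaccess form, with a hypothetical stand-in for the old address since the deleted page’s URL isn’t shown above:

    # Permanently redirect the deleted page to its replacement.
    Redirect 301 /old-deleted-page/ http://www.adjuice.co.uk/camberley-seo/

A quick way to confirm such a redirect is live is to run curl -I against the old address and check that the response is a 301 with a Location header pointing at the new page.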
I corrected this and, within a few hours, the new page reappeared at number 5 on a search for “SEO” and at number 5 on a search for “SEO services”. No page of the site had previously ranked in the top 30 for “SEO” on its own (I have never targeted that term). The home page had been ranking on page 1 (at number 10) for “SEO services” but, hey, I’ll take another 5 places if they’re up for grabs for free!
Searches for other terms like “SEO companies” did not trigger the new page to rank in place of the home page, which I found slightly surprising.
This discovery nicely rounded off the experiment in the aftermath of Google Venice. While it is not possible to be certain, it does look like two key factors in ranking higher in the localised organic search results are (1) a page with content that is highly relevant to the search query, and (2) a searcher in reasonably close physical proximity to the location implied by various characteristics of the page, including its content.
One might argue that the Camberley business address is a major contributory factor to the Camberley page ranking well, but this does not apply to two other pages on the site that rank equally well in the same circumstances for searchers based in Berkshire and Hampshire. This may mean that sites with large numbers of doorway pages can still be effective using these tactics, but I suspect (no proof here) that since the Panda updates they would probably need wholly unique content on every page.
Anyway, I’m getting a bit off topic now, so back to the plot.
The real message from the above three examples is that if you haven’t had an SEO audit carried out, you may never know what you’ve been missing! All 3 were easy to fix and all 3 fixes should generate more visits.
Related articles:
After I had started writing this post, the article below dropped into my inbox; it is a good write-up on SEO audits if you want to learn more.
How to Perform the World’s Greatest SEO Audit by Steve Webb on the SEOMoz blog.
Please feel free to kick off a discussion below. You can find all our contact details here or more about our SEO on our home page.
Come back soon for our next post or, better still, why not subscribe using the RSS button in the footer to get automatic notification of all future updates?