September 1st, 2008

Having a website design company build you a custom content management system is becoming quite the thing to do these days. Whilst the initial costs are much higher than those of a static website, many small companies prefer the option of updating and creating new content at will, both major advantages of a CMS over traditional static website development. The problem, though, is whether you are getting value for money: with even an average bespoke system costing upwards of £6,000, it's very important that your website performs. A lot of website designers and PHP programmers aren't even aware of the impact a badly designed (in terms of SEO) content management system can have on their clients' potential rankings. As you're the customer, it's up to you to make yourself aware of the pitfalls and how to avoid a "bad" CMS.

3 SEO Checks To Make On Your Custom CMS

1. Search Friendly URLs – Most CMSs work by querying a database with a set of parameters and populating a templated page based on that query. As a result, the URL you see in the address bar will often look something like www.example.com/index.php?section=products&id=42 (a hypothetical example). Search engines do not like long query strings and are quite partial to simply knocking them off, so you could end up severely limiting which pages get indexed by the search engines. A smaller list of indexed pages will most likely result in less "free" organic search traffic. The easiest way of fixing this problem is to have your web design company use the URL re-write rules that come with a Linux/Apache platform to create search friendly URLs. The example above could end up looking like www.example.com/products/42/ (and in reality it wouldn't be hard to make that a descriptive, keyword-rich path instead of a number). No query string, nothing for the search engines to dislike. If you are going down the IIS/ASP route there are workarounds, they are just a little bit more involved.
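As a rough sketch of how those re-write rules look, assuming an Apache server with mod_rewrite enabled, a front controller at index.php, and illustrative parameter names (your CMS's real query-string parameters will differ):

```apache
# .htaccess in the site root — assumes mod_rewrite is installed and enabled
RewriteEngine On

# Map a friendly path like /products/blue-widget/ back to the
# CMS query string the templates actually expect.
# The "section" and "item" parameter names are illustrative.
RewriteRule ^products/([a-z0-9-]+)/?$ index.php?section=products&item=$1 [L,QSA]
```

The visitor and the search spider only ever see the friendly URL; the rewrite happens internally on the server, so the CMS code itself needs no changes.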

2. Duplicate Content – The search engines (well, Google mainly) hate duplicate content, so it's in your interest to only ever feed them one copy (under one URL) of each unique page of your website. Luckily there is a ready-made mechanism for making sure this happens: the robots.txt file. The file is read by the search spiders/robots, which use it to work out what they can or can't crawl from your website. Where this matters with custom content management systems is that developers have a tendency to provide multiple paths to the same content (be it categories, tag clouds or just a simple search function). Whilst this is great for visitors, you have to make sure that the search engines only get one copy of your content.
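As a sketch, assuming the duplicate routes to your content live under paths like /search/ and /tags/ (the directory names here are illustrative — match them to whatever your own CMS uses), a robots.txt in the website root blocking those routes while leaving the canonical pages open might look like:

```
# robots.txt — must sit in the root of the website
# Path names are illustrative; use your CMS's actual duplicate routes
User-agent: *
Disallow: /search/
Disallow: /tags/
```

Visitors can still use the search box and tag cloud as normal; the file only instructs well-behaved spiders to skip those alternative paths, so each page gets indexed under a single URL.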

3. Only Make Secure What Needs To Be Secure – When you are having your CMS written it's always nice to have a secure area, maybe for staff or registered users. The problem is that the search engines can't spider anything that requires a login (99% of systems use cookie authentication, and the search robot can't pass a cookie to authenticate itself). It's very important, then, that at the design stage you make it clear you want as much content as possible in the open, public part of your website. You'd be amazed what sort of content can draw organic search traffic; don't limit yourself by hiding away content that doesn't need securing.
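As one sketch of keeping the restriction narrow, assuming an Apache server and that the private area lives under a single /admin/ directory (the path and password-file location are illustrative, and your CMS may use its own cookie-based login instead of HTTP Basic authentication), access control can be applied to just that directory rather than the whole site:

```apache
# /admin/.htaccess — protects only this directory;
# everything outside /admin/ stays open to visitors and spiders
AuthType Basic
AuthName "Staff Area"
# Keep the password file outside the web root (path is illustrative)
AuthUserFile /home/site/.htpasswd
Require valid-user
```

Whatever the authentication mechanism, the principle is the same: fence off the staff pages only, so the rest of your content remains crawlable.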