Thursday, May 22, 2008

SEO: make your own luck...

A version of this piece was published in Marketing in 2008

Richard Wiseman at the University of Hertfordshire has spent eight years looking at what makes people lucky. We’re not talking rabbits’ feet and avoiding ladders here – he’s devised four principles that determine a person’s likelihood of success.

Some of these veer a little towards the obvious; ‘Maximise your chances of something good happening by creating, noticing and acting on opportunities,’ he says – which seems a little like saying you can avoid the misfortune of sinking by swimming.

But at least we now know that we really do make our own luck. And nowhere is this more true than in natural search.

Search marketing has become a huge business in the UK. We’re Google’s second biggest market, and search alone will represent just under 10% of all UK advertising this year. This might seem big, but the real search market is five times that size.

Because 80% of traffic comes from the natural results – in Google, the listings below and to the left of the paid-for ones.

And these ‘natural’ or ‘organic’ listings can’t be bought. Instead, your position in the rankings is determined by the relevance that the search engine’s algorithm judges your site to have.

So with such a huge volume of traffic to play for, you’d think it’d attract a lot of attention from marketers.

But search engine optimisation (SEO) – the way of manipulating sites to improve their ranking – is fraught with difficulty. Traditionally the unaccountable face of search, it’s gained a reputation for impenetrable jargon (even for digital media) and unethical practice, and many sites don’t realise the influence they can have on their ranking.

Often, marketers simply aren’t aware that SEO is needed, thinking it comes ‘in the box’ when they buy their website. But the skills and preoccupations of web designers are very different to those of SEOs – designers concern themselves with design, copy, usability and so on, whilst SEOs focus on metadata, tags, redirects and links, and their super-niche skills change constantly to reflect the hyperevolution of search.

In other words, websites are usually designed for people, and many ignore that other vital audience, the spiders that index sites for search engines.

These spiders see websites very differently. Animation and images are often invisible to them; they need clues to help them understand site structure, and if you’re not careful they can easily misinterpret your efforts.

Take the 301 redirect. Lots of sites have both the .co.uk and the .com web address, but only one site – you type in one and get redirected to the other. Done the wrong way, Google’s spider looks at this and assumes you’ve got two sites with the same content – a common technique for trying to fool the search engine into ranking you higher. So Google’s algorithm will penalise your site – pushing you down the ranking.

The solution is simple. The redirect needs to be a particular type – a 301, or ‘permanent’, redirect. Doing this makes no difference to users, but tells Google you’ve only got one site – meaning you don’t get penalised.
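For the technically curious, here’s a minimal sketch of what that looks like under the bonnet – a hypothetical server answering any request to the non-canonical address with a 301. The hostnames and the WSGI framing are illustrative assumptions, not from the original piece:

```python
# Minimal, hypothetical sketch of a 301 redirect (WSGI-style).
# Assumption: "www.example.com" stands in for the canonical address.
def redirect_app(environ, start_response):
    canonical = "www.example.com"  # the one address you want Google to index
    host = environ.get("HTTP_HOST", "")
    path = environ.get("PATH_INFO", "/")
    if host != canonical:
        # "301 Moved Permanently" tells the spider the content lives at
        # one address only - so it sees one site, not two duplicates.
        start_response("301 Moved Permanently",
                       [("Location", f"https://{canonical}{path}")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>site content</html>"]
```

A visitor typing the other address is whisked to the canonical one without noticing; the spider, meanwhile, records that the two addresses are one site.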

There are hundreds of techniques like this, and implementing them properly can have a huge impact not just on the volume of traffic you get from search, but on the quality of your listing – one advertiser went from 500 to 23,000 referrals a month on one keyword alone, just by implementing a proper SEO programme.

But nowadays, effective SEO also impacts on paid-for search. Google takes the quality of your landing page into account when determining your ranking in paid search, adjusting the minimum bid downwards if it deems site quality to be high. So a poorly-optimised site might have a minimum bid of 15p, whilst a well-optimised site could be 10p – meaning an SEO programme could pay for itself just in savings on paid-for search.

With this much value at stake, we can’t afford to let search happen to us. It’s time for sites to throw out the rabbits’ feet and start making their own luck.