BMW blacklisted by Google
I wrote this post a while back. The content may still be relevant, but the information I've linked to may no longer be available.
The BMW group has been blacklisted by Google for using 'doorway' pages in order to influence search results. Naughty.
The story was first reported on Matt Cutts' blog. This is a useful website if you want to keep up with the latest Google news from a Google insider. Matt Cutts also reported that a prominent camera manufacturer will be blacklisted for similar reasons.
'Doorway' pages are pages stuffed with keyword-rich text content (for search engines) that redirect regular users to another page. In many cases the final destination page is 'more attractive' in terms of images but less effective for search engine listings.
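A minimal sketch of the trick, with a hypothetical `example.com` URL (this is illustrative only, not BMW's actual code): the page body carries the keyword text for crawlers, while a script that search-engine bots of the era did not execute whisks human visitors off to the prettier destination page.

```javascript
// Hypothetical doorway-page redirect (illustrative; example.com URL assumed).
// The surrounding page body would be stuffed with keyword-rich text for the
// crawler; real browsers run this script and never see that text.
function doorwayTarget() {
  // The 'attractive' destination page regular users are sent to.
  return "http://www.example.com/showroom.html";
}

// In the doorway page's <head> this would run as:
//   window.location.href = doorwayTarget();
```

Since crawlers at the time indexed only the static markup, they ranked the keyword text while humans only ever saw the destination page.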
It's quite surprising that BMW were caught like this. After all, the technique is quite well known and usually regarded as a 'black-hat' SEO technique. I guess the Head of SEO at BMW is now out of a job?
Comments
13 Feb 2006 19:33:22
What would have surprised/delighted me is if Google and its filters had automatically detected the spam presented by BMW. I still see well-ranked sites using redirects, client sniffers and other ‘black-hat’ techniques, so chances are BMW were dobbed in, not found out.
As you mentioned in a previous post, there are superior alternatives to Google as a search and resource utility; they are just as susceptible to spam, however.
Search2.0 will be interesting…
15 Feb 2006 15:45:12
I must admit that I assumed that Google could detect redirects of this type. If not, why not…
15 Feb 2006 19:19:47
Although rumours have been around since 2003 that Google will parse simple scripts, the truth is I’ve tried to force Google through a variety of scripts and it simply refuses.
Since JS is primarily a behavioural layer, I guess it serves no purpose for search engines to understand script, other than to improve the quality and detect doorways etc. The problem with that is how many ways can you build up a string and pass it to location.href to dodge Googlebot?
If Google were to understand script then I guess location.href would detect many doorways and document.write would grab new content.
Somehow I think it opens too big a can of worms for them…
Comments are OFF for this post.