It was during a quick beer this afternoon that my good friend Ryan Mckay told me about James Breckenridge’s post regarding the Google Webmaster Tools bug. For those that don’t know, there appears to have been a bug that allowed any verified Google Webmaster Tools user to remove pages, even if the site in question wasn’t under their direct jurisdiction. By simply typing in the following URL you could wield power many negative SEOs only dream of:
By doing so you could block a whole site, a section or a single page, depending on how you entered the URL. To block a site, use the top-level domain (e.g. http://www.someurl.com/); to block a section (subfolder), use a subfolder URL (e.g. http://www.someurl.com/somefolder/); and to block a page, use the specific page URL (e.g. http://www.someurl.com/somefolder/somepage.html). As the Compare the Market meerkat would say – Simples.
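The scoping rule above is simple enough to sketch in a few lines. This is purely illustrative – the function name and logic are mine, not Google’s – but it captures how the trailing slash and path depth determined what a removal request would block:

```python
from urllib.parse import urlparse

def removal_scope(url: str) -> str:
    """Classify what a removal request for `url` would block,
    per the pattern described above (illustrative only)."""
    path = urlparse(url).path
    if path in ("", "/"):
        return "whole site"        # bare top-level domain
    if path.endswith("/"):
        return "section (subfolder)"  # URL ends in a folder
    return "single page"           # URL points at a specific file
```

So `removal_scope("http://www.someurl.com/somefolder/")` comes back as a section-level block, while the same URL with a page name appended would only block that page.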
[Screenshot: BingoplayUK removal request]
Now the fact an unscrupulous user could do this is worrying enough – and whilst permanent deletion is unlikely, due to the way these removal requests work, this could still have had catastrophic consequences for any organisation that relied heavily on organic search for traffic. Just imagine if this had hit a site such as Moneysupermarket, for example, which dominates the UK marketplace for many high-volume, high-cost terms such as mortgages, credit cards and car insurance. The exploitation of such a loophole could have been devastating.
Now imagine that that self-same organisation held all your personal data. Oh, wait – with recent developments, that is a real-life scenario. What really worries me is the simplicity of the loophole: a simple change to the query string and one could make hay. There was no checking or validation against user data – as long as the requester was a verified Webmaster Tools user, that was good enough. That, to me, is a very worrying scenario, and to be honest one that makes me very conscious of the amount of data I am happy to give Google.
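To be clear about what was missing: any sane implementation would check that the URL being removed sits under a property the requester has actually verified. A minimal sketch of that check – my own illustrative code, with assumed names and data shapes, not Google’s actual implementation:

```python
from urllib.parse import urlparse

def can_remove(verified_sites, target_url):
    """Allow a removal request only if the target URL's host matches
    a host the user has verified ownership of."""
    target_host = urlparse(target_url).netloc.lower()
    verified_hosts = {urlparse(site).netloc.lower() for site in verified_sites}
    return target_host in verified_hosts
```

With a check like this in place, a user verified only for www.someurl.com would be refused a removal request against anyone else’s domain – which appears to be exactly the validation the buggy endpoint skipped.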
This may well just be a reality check. We have seen a lot of development from Google over the last year or so, to the point that it seems not a day goes by without another tweak, change or test. With that pace has to come a degree of danger in ensuring that all the i’s are dotted and t’s crossed.
I have to commend Google on how quickly they reacted to the loophole; certainly by the time I was doing the State of Search Webmaster Radio show with Bas Van Den Beld and Roy Huiskes, the loophole had been patched. However, if this was a simple oversight, I seriously hope it’s a one-off – and perhaps a wake-up call that it only takes one mistake to see it all come crashing down.