Google wtf 2: Dynamic URLs vs. static URLs
On Monday a post appeared on Google’s Webmaster Central blog that was so gobsmackingly strange I’ve had to let it sit for a few days to make sure my post-weekend reduced cognitive function wasn’t the reason I didn’t get it. I’m talking about Juliane Stiller and Kaspar Szymanski’s points on dynamic URLs vs. static URLs.
The post outlines a bunch of reasons why website creators shouldn’t create static versions of dynamic URLs (e.g. changing site.com/product.asp?id=99&brand=1 to site.com/sony/walkman). Among the reasons not to use static URLs they listed the following:
- They’re hard to create
- They’re hard to maintain
- It’s better to let Google guess which parameters are important
- If you want a static URL you should create a static equivalent of your content
The hard-to-create-and-maintain part might be true of some webmasters, but I think the majority would greet that statement with a hearty “What the?!”. Most developers will have at least one URL-rewriting tool in their box: typically mod_rewrite when developing on a LAMP stack, ISAPI_Rewrite on IIS, and just about any web scripting language you care to name has functionality to write static (or semi-static) files to the file system. Static-seeming URLs have been recognised as best practice for years; it’s not like this is bleeding-edge technology developers are just getting to grips with.
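For anyone on a LAMP stack, the kind of rewrite I’m talking about can be set up in a handful of lines. The sketch below is purely illustrative (the slug pattern and the product.asp parameters are my own invention, not anything from Google’s post), but it shows the general shape:

```apache
# Hypothetical .htaccess sketch: serve /sony/walkman from a dynamic script.
# The brand/product parameter names are assumptions for illustration only.
RewriteEngine On
RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/?$ /product.asp?brand=$1&product=$2 [L,QSA]
```

In a real application you’d usually look the slugs up in your database rather than pass them straight through, but the point stands: this is a solved problem, not rocket science.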
As for letting Google guess which parameters are most important, I don’t even know where to begin with that. Does the Google Search Quality team really think they know my site better than I do? They’re good, but until they plant that neural chip Sergey is no doubt working on and start indexing the content of my brain that’s just not going to happen.
As for creating a “static equivalent of your content”, isn’t “static equivalent” just another way of saying “duplicate content”?
If that wasn’t enough reason to be scratching your head at the article, there are also plenty of factors they don’t mention which make excellent reasons for using static-seeming URLs:
- Improved rankings. Even if search engines drop URLs as a ranking factor, it’s still going to help to have a keyword-rich URL
- Security. I’m no fan of security through obscurity but given the choice to obfuscate or not I’ll do it, all other things being equal
- Convenience. Static URLs tend to be easier to pass around over IM or email.
- Portability. If I decide to change my scripting platform from ASP to PHP (or whatever) and don’t have full access to the web server I have to change all my URLs. Not if they’re static.
And the #1 reason we, and everyone else, should ignore this advice from Google:
- Usability. The URL is often the first thing a visitor knows about a page, and they appreciate the early clue as to its content. Static URLs engender more user trust and goodwill than dynamic ones. People just like them. So if you like people, go static.
Oh, and the URL the post appeared on? http://googlewebmastercentral.blogspot.com/2008/09/dynamic-urls-vs-static-urls.html.
Bizarre.
23 Comments
Damien van Holten - http://www.reaact.net/blog/
A very strange post from Google to say the least. Once again they make the same mistake: assuming people make websites for Google.
Sure I do SEO, but I create websites for visitors not Google.
Daniel Mcskelly - http://www.bronco.co.uk
My thoughts exactly Damien. I’m a big believer in “optimize for humans and engines will love you too”. It’d be a real shame if Google starts imposing conditions that detract from the user experience just to make things easier for Googlebot.
Florida SEO - http://www.edwardbeckett.com/
Web Sites Are for Visitors …
Search Engines Bring Visitors …
Pretty URLs … Are Better For Rankings and Visitors …
As to the post by Juliane Stiller and Kaspar Szymanski …
I’m ignoring them …
g1smd
I think the main intention was simply to dissuade beginners from doing rewrites, as done badly they can make things worse… There are plenty of live examples out on the web that amply demonstrate that point.
Even a great many of the various forum, blog, and CMS packages – even the most well known and popular ones – are full of these sorts of problems, and various “SEF URL packages” don’t always fully address the problems.
Maybe in those cases Google would have preferred that the designers had left things well alone, as by implementing rewrites badly they have made things far worse for crawlers, not better.
For those that really do know what they are doing, are aware of all the risks, and properly test their work with a wide range of expected and *unexpected* URL requests, then carry on as before.
Peter Young - http://holisticsearch.co.uk
Completely agree Dan. Think they have also forgotten to mention – we don’t want you to GUESS which aspects are most important – we would like to advise you what they are. Less open to differences of opinion that way.
@g1smd – in that case I wish they would stop writing blanket posts aimed at the lowest common denominator
Jim McNelis - http://ditoweb.com
Could their concern possibly be multiple static URLs pointing to the same dynamic URL?
In other words, static URLs for your one dynamic page on Viagra.
g1smd
*** Could their concern possibly be multiple static URLs pointing to the same dynamic URL? ***
Undoubtedly. There are very many sites with incomplete or botched implementations.
Adrian Berry
I was perplexed too with this one Dave – the URL rewrite for the article kinda sealed it for me! Plus, if static URLs are not that important, why is G highlighting the keyword terms in the listings? A quick search on the term “seo training” shows the first 4 listings are keyword-rich with the term highlighted. Not important, eh?
Vinay - http://www.juretic.om
From Google Webmaster Guidelines
Quality guidelines – basic principles – “Make pages primarily for users, not for search engines.”
&
Design and content guidelines – “If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages.”
Now.. WTF? Do they even check their own Guidelines before posting the Blog Article?
And now they come up with a Blog Post requesting webmasters not to switch to Static URLs over Dynamic!?
“It’s much safer to serve us the original dynamic URL and let us handle the problem of detecting and avoiding problematic parameters.” – You gotta be kidding? You could have placed this in your Webmaster Guidelines rather than making a Blog Post 🙂
Something is Insane here 😀 !! Please continue your “Google WTF xx:” series 😀
g1smd
The guidelines are older pages, obviously yet to be updated.
I guess the wording there will change in the coming weeks.
Richard - http://www.webhelp.co.nz/blog/
I have to agree, the post from Google was a bit of a curve ball. Bit hard to believe that they would post something like this.
I think it was Google’s way of saying that if you have a dynamic URL structure, they won’t have an issue crawling & ranking your site.
I’m sticking with static URL structures; there are so many positives in doing it that way.
Floogy - http://floogy.com
Vinay said: “Design and content guidelines – “If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages.”
Perhaps that’s why they would rather we use static pages. With over 1 trillion indexed pages, web site owners now make up a large percentage of www users. If we don’t rank well in every search engine, which will we like best and therefore recommend to our peers?
Perhaps the low road of emphasizing others’ potential weaknesses instead of focusing on oneself is catching on in the corporate world, thanks to Apple and the current presidential campaign.
nicg - http://www.luxuryworldwideholidays.co.uk
I hope you guys are right – I’ve just spent 2 days, and finally cracked mod_rewrite!!!!!! My site will definitely be more attractive to humans 😉
Phil Buckley - http://seo.1918.com
The main thing I took away from the article was that there’s no reason sites MUST rewrite URLs – that Google doesn’t do a worse job with dynamic URLs.
After that, the rest is just the opinions of the writers.
Frank Reads, 25 Sep 2008, Back From SMX | Gadgets, Games and SEO - pingback
[…] – DaveN complains Google wtf 2: Dynamic URLs vs. static URLs; […]
Google Promotes Uncool URLs | Hobo - pingback
[…] More discussion at SERoundtable, Webmasterworld & at Search Engine Land and now at DaveN. […]
Ophir Cohen - http://www.compucall-usa.com/2008/09/24/google-sees-the-glass-half-empty-dynamic-url-vs-static-url/
Definitely weird post by Google. We also submitted our view here http://www.compucall-usa.com/2008/09/24/google-sees-the-glass-half-empty-dynamic-url-vs-static-url/
It seems this post is intended for novice webmasters. Actually, for them it’s probably best…
A Playoff Worthy Lineup Of Links - This Month In SEO - 9/08 | TheVanBlog | Van SEO Design - pingback
[…] Google wtf 2: Dynamic URLs vs. static URLs […]
TomA
I’m certainly not changing my URL structure for a search engine.
How to use wildcards in robots.txt - pingback
[…] been quite a large reaction to Google’s announcement that you don’t have to rewrite your URLs to […]
How to use wildcards in robots.txt for sites that use dynamic query parameters - pingback
[…] been quite a large reaction to Google’s announcement that you don’t have to rewrite your URLs to […]
REBel66 - http://www.poshcuffs.co.uk
I have converted my site http://www.poshcuffs.co.uk from dynamic URLs to static URLs such as http://www.poshcuffs.co.uk/cufflinkpages/545-spitfire-cufflinks.php.
The first consequence is that Google threw out all my listings on the dynamic URLs and replaced only about 1/3 of them with the static ones. Later on, if I edited the content on the static pages, they often just popped onto P1 the next day.
The static URLs definitely rank higher than dynamic ones, without question.
I don’t use mod_rewrite for this; I think that’s a way to guarantee mistakes. Instead I wrote functions to generate the URLs: the same function that generates a URL when it’s presented as a link also writes the file name. That way there’s less potential for mistakes.
The idea that Google should decide what parameters are important is nonsense.
IMHO parameters are only for variables that you can’t predict, like search text. Anything that is fixed or chosen from a list is by definition static.
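[Ed: the single-source-of-truth approach described above can be sketched roughly like this. Python is used purely for illustration, and the function and path names are invented, not taken from the comment.]

```python
import re

def static_url(product_id: int, name: str) -> str:
    """Build the static URL for a product.

    Used both when rendering links AND when deciding which file
    name to write, so the two can never drift apart.
    """
    # Slugify: lower-case, collapse non-alphanumerics to hyphens
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    return f"/cufflinkpages/{product_id}-{slug}.php"

# Rendering a link and writing the file both call the same function:
url = static_url(545, "Spitfire Cufflinks")
link = f'<a href="{url}">Spitfire Cufflinks</a>'
file_path = "htdocs" + url  # the file lands at the exact path the link points to
```

Because every consumer goes through one function, a broken or mismatched URL can only come from one place, which is the commenter’s point about reducing the potential for mistakes.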