I've never really worried much about using query strings versus setting up subdirectories to limit the length of the query string. However, the following shows what both Google and MSN are doing when spidering one of my sites:

GET /?c1=canada&c2=ontario&c3=trento
GET /?c1=canada&c2=ontario&c3=kingst

There are letters missing at the end, which makes the queries invalid. So, empirically, it looks like Google has a 30-character limit (maybe only by default) on the query string after the ?. Has anyone else noticed this? I was surprised to see MSN doing exactly the same thing. I almost wonder if they're sharing sitemap data or something, since this site has a simple text sitemap that contains the correct URLs to spider... but apparently it isn't being used. Heck, I might even be better off dropping the sitemap if it's going to miss a deeper layer of content that default spidering might otherwise reach. Any thoughts?
My URLs for GenerationTalk or Arcon5 aren't particularly user friendly (although this may change soon), but MSN and Google don't truncate them, so maybe it's just an internal error with your website.
I have never noticed this, but then again, I do try to keep my URLs as short as possible just so that my visitors can remember them more easily. In fact, I've been using mod_rewrite for some time now, so all my URLs are search engine friendly. Well, nearly all of them, anyway.
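If anyone wants to try the same approach on a URL scheme like the one in the first post, here's a minimal mod_rewrite sketch. It assumes Apache with mod_rewrite enabled and reuses the c1/c2/c3 parameter names from the logs above; /index.php is just a stand-in for whatever script actually handles the request on your server.

    # .htaccess sketch (assumes mod_rewrite is enabled)
    RewriteEngine On

    # Map a short, spider-friendly path like /canada/ontario/trenton
    # onto the internal query-string form shown in the logs above.
    RewriteRule ^([a-z-]+)/([a-z-]+)/([a-z-]+)/?$ /index.php?c1=$1&c2=$2&c3=$3 [L,QSA]

That way the spiders only ever see the short path, and the query string never enters the picture.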
Hmm, I may have to vote for an internal error... it's possible the logging is only recording the first so many characters of each request. I hope so... it would save me from having to redesign things!
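One quick way to test that logging theory: scan the access log and see whether the query strings all top out at the same length. Here's a rough Python sketch; the log path and the Apache combined log format are assumptions, so adjust them for your server.

    import re

    # Hypothetical log path; point this at your real access log.
    LOG_PATH = "/var/log/apache2/access.log"

    # Matches the request line in a common/combined-format Apache log,
    # e.g.: "GET /?c1=canada&c2=ontario&c3=trento HTTP/1.1"
    REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+"')

    lengths = {}
    with open(LOG_PATH) as log:
        for line in log:
            match = REQUEST_RE.search(line)
            if not match or "?" not in match.group(1):
                continue
            query = match.group(1).split("?", 1)[1]
            lengths[len(query)] = lengths.get(len(query), 0) + 1

    # If every query string clusters at one length (e.g. 30), something
    # is truncating: either the spiders or the logging itself.
    for length in sorted(lengths, reverse=True)[:5]:
        print(f"{lengths[length]} requests with a {length}-char query string")

If your own browser requests with long query strings also show up capped at 30 characters, it's the logging; if only the spider hits are capped, the truncation is happening on their end.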