I'm currently re-doing my site, http://www.lazarusdesigns.co.uk, slightly so that it displays differently based on resolution. I'm using JavaScript to detect the resolution, then reload the page and store the resolution in a cookie for the rest of the site. Does anyone know either a) a better way to do this, or b) whether using JavaScript's location object to reload a page with an extra parameter is likely to annoy any spiders?
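Roughly what I've got in mind looks like this (just a simplified sketch, not my exact code — the 1024-pixel cutoff and the hi/lo values are placeholders):

<script type="text/javascript">
// Only redirect if the res parameter isn't already in the URL
if (location.search.indexOf('res=') == -1) {
    // Classify the visitor's screen width (1024px cutoff is just an example)
    var res = (screen.width >= 1024) ? 'hi' : 'lo';
    // Store it in a cookie for the rest of the site
    document.cookie = 'res=' + res + '; path=/';
    // Reload the page with the extra parameter appended
    var sep = (location.search == '') ? '?' : '&';
    location.replace(location.href + sep + 'res=' + res);
}
</script>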
Spiders cannot read JavaScript. Put your scripts in a separate file and include them with <script type="text/javascript" src="your-js-file.js"></script>.
Yeah, I realise that. I just don't know if the actual process of reloading a page with an extra parameter of &res=hi the first time someone accesses my site will be seen as 'dodgy' by any search engines...
Since search engines can't run the JavaScript, they won't be redirected to the same page with the parameter added, so you should be fine.
FWIW, I've seen Google spider a link that was embedded in <script> tags within an HTML page. So it may not execute the JavaScript, but GoogleBot will happily take anything that looks like a URL and try to load it. I've verified this directly by analyzing logfiles from the following code. The HTML page contains:

<script>
testImage = new Image();
testImage.src = 'myimage.gif';
</script>

The logfiles show that GoogleBot has requested "myimage.gif". How d'ya like that? LC88
Yes, it would appear so. Minstrel, have you seen any other threads/blogs/info about this? I'd like to research it a bit further... Thanks, LC
When GoogleBot version 2 came out (this past spring, was it?), there was quite a bit of discussion about it on various forums, perhaps this one included. I can't give you actual links, but I'm sure a Google search would dig them up.
Here it is: http://forums.digitalpoint.com/showthread.php?p=31485#post31485. I'm sure there were similar threads on other SEO forums.
Interesting... I may just have to write a test page with variously-obscured URLs within JavaScript and see what the logfiles show for GoogleBot activity.
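Something along these lines, perhaps (a rough sketch only — the /jstest/ filenames are made up for the experiment, with each variation pointing at a unique file so the logfiles show exactly which ones GoogleBot requested):

<script type="text/javascript">
// 1. URL written out in full inside the script
var plain = new Image();
plain.src = '/jstest/plain.gif';

// 2. URL split across a string concatenation
var concat = new Image();
concat.src = '/jstest/' + 'conc' + 'at.gif';

// 3. URL built character by character at runtime
var built = new Image();
built.src = '/jstest/' + String.fromCharCode(98, 117, 105, 108, 116) + '.gif';
</script>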
Actually, I meant the experiment would be to learn more about GoogleBot... I don't really care whether they index the test content. A lot of rumours have circulated about Google starting to evaluate JS, but there's very little detail.