Hi, I've noticed when running my site through a search engine simulator that my CGI links aren't seen. Does Google etc. not recognise URLs which include 'cgi' in them? Would it be better to use hard-coded links? Thanks
Google can crawl and index dynamic URLs (e.g. .php, .cgi, etc.); it's other search engines that have had trouble with them. I've had many sites (and still do) that use dynamic URLs and index just fine in all three major search engines. The trick is to make sure the dynamic URLs don't change.
I don't think this is Google-related, because Google can follow dynamic URLs. Is there anything that might be blocking Google from crawling those pages, like robots.txt or meta tags?
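One quick way to test the robots.txt theory is Python's standard-library robots.txt parser. The rules and URLs below are hypothetical, just to show how a common `Disallow: /cgi-bin/` line would hide CGI links from a well-behaved crawler; substitute your own site's robots.txt and URLs.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- many default server configs ship a
# rule like this, which blocks every URL under /cgi-bin/ for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A script under /cgi-bin/ is blocked by the rule above...
print(parser.can_fetch("Googlebot", "https://example.com/cgi-bin/page.cgi?id=1"))  # False
# ...while an ordinary page is still crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/about.html"))  # True
```

If the CGI URLs come back as blocked, the simulator (and Google) are simply obeying robots.txt; otherwise check the pages for a `<meta name="robots" content="noindex,nofollow">` tag.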