My website used to have scripts in the cgi-bin, which I have since converted to PHP and moved elsewhere. However, I still have several thousand URLs from this cgi-bin in the search engine indices, each with different parameters. I think these URLs are stuck there because the cgi-bin is now Disallowed in my robots.txt, so the bots can't see what's changed. Should I: (1) allow the bots in and then return a 404 for each URL; (2) allow the bots in and then redirect them to another page; or (3) simply stop worrying and forget about the whole thing? Any suggestions gratefully received! Cryo.
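For what it's worth, option (2) would look roughly like this in an Apache .htaccess file, assuming the host allows mod_rewrite; the old script name and the new PHP page below are just placeholders, not my real URLs:

    RewriteEngine On
    # Hypothetical example: permanently (301) redirect one old cgi-bin URL
    # to its new PHP equivalent; the query string is carried over by default.
    RewriteRule ^cgi-bin/oldscript\.cgi$ /newpage.php [R=301,L]

A 301 (permanent) redirect, rather than a 302, is what tells the engines the old URL has moved for good.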
Hey guys! Please don't mind my asking: I would like to know what a robots.txt file is and how to restrict bots in robots.txt. Also, how do you go about "returning 404 for each URL"? I am new to SEO, so if you could explain it, that would be a great help to me. Thanks.
robots.txt - http://www.searchtools.com/robots/robots-txt.html. A 404 is the error page that shows up when there is no file at the requested URL. Simply make a 404.shtml (on my host, at least), put whatever you want inside it, and upload it to your root folder.
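As a minimal sketch, a robots.txt that keeps all bots out of the old cgi-bin would look like this (it goes in the site root):

    # robots.txt - block all crawlers from the old cgi-bin
    User-agent: *
    Disallow: /cgi-bin/

Some hosts pick up a 404.shtml in the root automatically, as described above; on a typical Apache setup you may instead need to point the server at it yourself with a line like this in .htaccess (assuming the same filename):

    # Serve a custom page whenever a URL returns 404 Not Found
    ErrorDocument 404 /404.shtml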