Hi people, I have a games arcade site and have noticed that many pages are not being found (404) when Google crawls my site. Apparently I can set up a robots.txt file which will stop Google crawling these particular pages. How beneficial would this be? And how easy is this file to create? Thanks in advance, clouting
Hi, Robots.txt is a plain text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. It's very simple: first you create a text file named robots.txt and upload it to your site's root, so it's reachable at http://www.mydomain.com/robots.txt. Then read the link below for the next steps; it will guide you through what you're trying to do. webconfs.com/what-is-robots-txt-article-12.php I hope this helps. Thanks, Jay
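For example, a minimal robots.txt that tells every crawler to skip a couple of directories might look like this (the /old-games/ and /temp/ paths are just placeholders; substitute whatever paths on your site are returning 404s):

    User-agent: *
    Disallow: /old-games/
    Disallow: /temp/

The * means the rules apply to all robots, and each Disallow line blocks one path prefix. Save it as robots.txt in your site's root directory and crawlers will pick it up on their next visit.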
Robots.txt helps you control which parts of your domain search engine robots may crawl. Keep in mind it's a request that well-behaved crawlers honor, not a security barrier. It's worth reading up on the directives (User-agent, Disallow) so you can create an effective file.
You can also create a custom 404 page with links to the most popular areas of your site. Visitors who hit a dead URL get somewhere useful to go, and crawlers can follow those links. See the sketch below for one way to set it up.
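How you wire that up depends on your server. On Apache, for instance, a single line in an .htaccess file will do it (this assumes your host allows .htaccess overrides and that you've already created the page at /404.html):

    ErrorDocument 404 /404.html

Just make sure the custom page still returns a real 404 status rather than a 200, otherwise search engines may treat the error page as a normal page and index it.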