Hi guys, if I want to create a robots.txt that allows all search engines to crawl every page of my site, is this content correct? Please advise... thanks!

User-agent: *
Disallow:
Sitemap: http://www.mysite.com/sitemap.xml
All right, I got it. I will do it now! By the way, what will be the impact if I don't create this robots.txt for my site and only submit my sitemap to Google?
If you don't give the robots any commands and just submit your sitemap to Google, the crawler will automatically crawl all of your pages.
Well, what do you actually want to do? You're talking about submitting a sitemap, which means you want your pages crawled and indexed, but at the start of this thread you use a robots.txt file with a 'Disallow' command in it. That directive is what stops the bots from crawling certain pages, directories, etc...
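For reference, it's the value after Disallow that does the blocking; an empty value blocks nothing at all. A quick side-by-side (the /private/ directory is just a hypothetical example):

# Block nothing - an empty Disallow allows everything
User-agent: *
Disallow:

# Block one directory
User-agent: *
Disallow: /private/

# Block the whole site
User-agent: *
Disallow: /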
No... you got me wrong. I am asking whether my robots.txt content is correct if I want the search engines to crawl all of my site's pages. Is this correct?

User-agent: *
Disallow:
Sitemap: http://www.mysite.com/sitemap.xml
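Yes, that file allows everything. If you want to sanity-check a robots.txt like this yourself, here is a minimal sketch using Python's standard-library urllib.robotparser (the page URL passed to can_fetch is just a placeholder):

from urllib.robotparser import RobotFileParser

# The exact rules from the post above.
rules = [
    "User-agent: *",
    "Disallow:",
    "Sitemap: http://www.mysite.com/sitemap.xml",
]

rp = RobotFileParser()
rp.parse(rules)

# An empty Disallow blocks nothing, so any bot may fetch any page.
print(rp.can_fetch("Googlebot", "http://www.mysite.com/any-page.html"))  # True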