Which of these schemes works best for a custom blog's robots.txt:

User-agent: *
Disallow:

or

User-agent: *
Allow: /
Thanks @seoshel, @shelinauto. I'll keep it with Disallow:. I'm also planning to add Sitemap: http://www.website.com/sitemap.xml to robots.txt, since I want my articles to get indexed faster by Bing/Yahoo. Is this a good way to do that?
I am not an expert, but to my eyes both codes do the same thing. Regarding your question about adding a sitemap: yes, you should definitely add one to your website. The articles should get indexed by Google faster, though that's no guarantee. Either way, you should totally add one.
@Sweely, I asked the question because our website doesn't get indexed very well. We are using Google Webmaster Tools to manage our sitemaps, but we want to be more visible and more up to date on Bing/Yahoo as well. @Prodan_13, actually it doesn't block anything, because that rule unblocks everything.
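To illustrate the difference (a quick sketch of the two cases):

User-agent: *
Disallow:

An empty Disallow value blocks nothing, so every crawler may fetch everything.

User-agent: *
Disallow: /

A single slash blocks the entire site for all crawlers.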
Yes. Adding a sitemap.xml code in your robots.txt works. I used that to my sites as well. I think its effective
The first one is the right choice. You do not need to give robots an explicit Allow signal, since crawling is allowed by default; only Disallow should be used in robots.txt files. Moreover, you should always include the sitemap path after one blank line. Sample below (using the placeholder domain from earlier in the thread):
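User-agent: *
Disallow:

Sitemap: http://www.website.com/sitemap.xml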
They both work the same, but using the first one leaves room for blocking the directories and files you do not want crawled.
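For example, you can later extend the first scheme like this (the paths here are just hypothetical examples):

User-agent: *
Disallow: /admin/
Disallow: /private/notes.html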
Yes, it does. Good grief, learn a subject before commenting; then you won't put both feet in your mouth.
Hi. You should go with the first robots.txt code, which you can put in the root folder of your website. Keep it up.
I read that these codes are important for any site. Many people don't want some pages on their site crawled, so they use the first code for that purpose.
You stated the first code would not work. This is the first code:

User-agent: *
Disallow:

I am telling you it WILL work.