Hi DP members, which is the best practice: using a robots.txt file, or using the robots meta tag between the head tags?
I prefer using robots.txt because you can allow or disallow folders, files, or specific crawlers all from one file. If you use the robots meta tag instead, you have to add it to each individual page you want to block, and you can't disallow an entire folder with the meta tag.
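For reference, here's roughly what each approach looks like (the folder name is just an example):

```
# robots.txt — placed at the site root, covers the whole site
User-agent: *
Disallow: /private/
```

```html
<!-- robots meta tag — must go in the <head> of each individual page -->
<meta name="robots" content="noindex, nofollow">
```

Note that Disallow in robots.txt stops crawling, while the noindex meta tag stops indexing, so they aren't fully interchangeable.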