So, I've been wondering for a couple of days why Google isn't crawling my new blogs. (I have links to them from highly relevant sites.) Then, using Google Webmaster Tools, I discovered that there are robots.txt files that don't let Google through. I go and check on the GoDaddy FTP but can't find anything. Then I type xxx/robots.txt into my browser and discover that GoDaddy has applied an exclude-all rule there. Nice. Why in the world are they doing that? Any ideas?
Damn, were you able to fix the issue? I'm also using GoDaddy but have no clue about robots.txt or any of that.
Yeah, I just added an empty robots.txt to each site's root folder. I'm wondering, though, why in God's name GoDaddy is doing this? I used Metropolis to install WordPress. I think everyone should check if they have the same problem.
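For anyone else checking their site: a blocking robots.txt (which is presumably what GoDaddy dropped in, based on what the OP saw) typically looks like this:

```
User-agent: *
Disallow: /
```

That tells every crawler to stay out of the entire site. An empty robots.txt works as a fix because no rules means nothing is disallowed, but if you want to be explicit, an allow-all file looks like:

```
User-agent: *
Disallow:
```

A blank Disallow line means "nothing is off-limits," so Google can crawl everything.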
paowiee, here's a little robots.txt tutorial I looked up for ya. If ya need any help figuring it out, PM me, I'll be glad to help. Maybe it's their master plan to keep all new hosted sites from competing! Or to keep server traffic down? Or maybe they're just dumb lol
They are just dumb; they make the stupidest moves ever when it comes to anything more than registering a domain.