Hi, I'm curious: Google Webmaster Tools recently told me that a LOT of my URLs were suddenly blocked by my robots.txt file, even though I didn't change anything. So I tried to get rid of the blocking by editing the file on my blog. Originally it was:

    User-agent: *
    Disallow: /search
    Allow: /

Now it is:

    User-agent: *
    Disallow:
    Allow: /

Is that correct? And will my current one result in duplicate content issues? Thanks so much for your help!
Wait, what's the point of that file if everything is allowed and nothing is disallowed? Simply delete all the directives you've mentioned; it will have the same effect. Other than that, it's "fine".
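If you want to verify that for yourself, here's a minimal sketch using Python's built-in urllib.robotparser; the example.com domain and sample URLs are just placeholders for your own blog:

    from urllib.robotparser import RobotFileParser

    old_rules = ["User-agent: *", "Disallow: /search", "Allow: /"]
    new_rules = ["User-agent: *", "Disallow:", "Allow: /"]

    for name, rules in [("old", old_rules), ("new", new_rules)]:
        rp = RobotFileParser()
        rp.parse(rules)  # parse the rule set from a list of lines
        for url in ["https://example.com/", "https://example.com/search?q=test"]:
            verdict = "allowed" if rp.can_fetch("*", url) else "blocked"
            print(f"{name} rules: {url} -> {verdict}")

Running it shows the old rules block anything under /search while the new rules block nothing at all, which is exactly the behavior you'd get with no robots.txt file.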
Hahaha, ^ this exactly. "I'm going to make a door, and always leave it open." ...o...ok... OP, did you try just removing everything first? Maybe your site's just going through a little dance. Run the site through a sitemap and robots.txt validator to make sure nothing else got changed. Might as well check your .htaccess for any recent changes too.
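If you can't find a validator you like, a quick stand-in is to point Python's urllib.robotparser at the live file Google is actually fetching; swap example.com for your real domain and the sample URLs for ones Webmaster Tools flagged:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # your blog's domain here
    rp.read()  # fetch and parse the robots.txt that crawlers actually see

    # Spot-check a few of the URLs Webmaster Tools reported as blocked
    for url in ["https://example.com/",
                "https://example.com/search?q=test",
                "https://example.com/2013/01/some-post.html"]:
        verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
        print(f"{url} -> {verdict}")

That tells you in seconds whether the file being served matches what you think you uploaded.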