Sometimes I go in and check to see if I have any errors or blocked URLs in the AdSense reports, and there have never been any problems at all. Ever! Today I went in and checked, and there were tons of errors: five pages of URLs that Google says my robots.txt file has blocked. I do not understand WTH is going on, since nothing has been changed in my robots.txt files since 2004. Is anybody else getting a lot of errors, or is it just me?
Here's a strange thing: the URLs that, according to Google, my robots.txt file has blocked are pages cached by Google itself. Now how is this possible? What is going on?
I found a similar issue within the last few days. It said my robots.txt file was the problem. I even emptied out my robots.txt file, and it is still showing about 5 blocked URLs. If anyone knows what the deal is, please reply. I'm going to wait a few days; if it continues, I'll contact Google.
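Just to be clear about what I mean by "emptied out": as far as I know, a completely empty robots.txt and an explicit allow-all should behave the same way for crawlers. This is only a minimal sketch, nothing specific to my site:

User-agent: *
Disallow:

(A blank Disallow line means nothing is blocked for any crawler.) Even with that in place, the report still shows blocked URLs, which is why I think the problem is on Google's side.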
What I am seeing, after some reasoning, is that the URLs are all cached pages carrying AdSense code, and for some reason Google decides that our robots.txt files are the cause of the problem, or it is just a "catch-all" reason in their reporting. For example, I see a report row like this: http://cc.msnscache.com/ - Robots.txt File - Dec 30, 2006 - 3. Now, why would my robots.txt file have a problem with msnscache? WTH is going on with Google these days?
Same here... I usually check the URLs listed under problems... nothing until now. Cached image pages, a lot of them... Maybe Google is blocking AdSense ads on cached pages...
Yes, I've got the same problem, but I didn't even create a robots.txt file. Anyway, will we still get the cash from those blocked URLs?
Well, I don't know if this applies to your case, but I solved my situation this way: I went to my Google Sitemaps account and looked at the robots.txt file cached by Google. It's under the Diagnostic page; just press "robots.txt analysis". There you will find not the actual robots.txt file you have now (in case you have already cleaned everything out trying to fix the problem), but the version cached by Google. If you find anything strange, as I did in one robots.txt, you may want to clean that up, and then you can re-test the file by going further down on that same page, selecting "Choose an additional user-agent to check against" and pressing "check". I discovered this because I had the same situation, and it was even blocking the Google bot from spidering those sites; after fixing it, everything is back to normal. Hope this helps.
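One more tip that may help with that "check against an additional user-agent" step: the AdSense crawler identifies itself as Mediapartners-Google, so that is the user-agent worth testing your cached robots.txt against. As a rough sketch only (the /private/ directory is just a made-up example, your own paths will differ), a robots.txt that blocks a directory for everyone else but still lets the AdSense bot through would look something like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /private/

If the cached copy Google is using does not include something like the first two lines, that could explain blocked-URL entries even when your live file looks fine.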
Yes, I had the same issue as MikeSwede. 30K pages are indexed and usually none are blocked... but earlier this week I had a long list of URLs like the ones you have described. Today I checked, and it is clean again, with none blocked.