I keep seeing these 403 errors in Google's Webmaster Tools (formerly Google Sitemaps). Like 99.9% of other sites out there, mine blocks directory listings for directories with no index.php file -- this site does the same, for example. The pages inside those directories are still crawlable and viewable to any visitor, browser, or bot, just like on any other site. The only thing nobody can see is a list of all the files in the directory, same as most other sites. Yet I keep getting 403 errors for every directory on my site, even though there is nothing out of the ordinary in my .htaccess or robots.txt. Is this a Google bug? Do some of you see this from time to time too?
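For reference, the usual way to get this behavior (assuming an Apache server, which the thread never actually confirms) is a single directive in .htaccess:

```apache
# Disable automatic directory listings (Apache).
# A request for a directory with no index file then returns 403 Forbidden,
# which is exactly the status Googlebot reports as a crawl error.
Options -Indexes
```

With this in place, /mydirectory/mypage.html still serves normally; only the bare /mydirectory/ URL returns 403.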
If a link to http://example.com/mydirectory/ exists anywhere and you block that URL, then Google reports it as an error. As long as you are not getting errors for actual pages, e.g. http://example.com/mydirectory/mypage.html, you are all good.
I don't have any internal links to http://example.com/mydirectory/ on my site. But someone else could be linking to my site with such a URL, and that could well be the cause of the errors. Another funny thing is that I see several dozen "Not found" errors for files that have not existed on my site for three years! I've been scratching my head over this one too. Maybe both issues have the same cause: some site somewhere still linking to these old files and directories.
I would recommend using .htaccess to 301-redirect the old files to your home page, and (if you have PHP) creating an index.php file in each directory with this code: <?php header("HTTP/1.1 301 Moved Permanently"); header("Location: http://example.com/"); exit; ?> That should keep Google happy.
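A sketch of the .htaccess side of that advice, assuming Apache with mod_alias enabled; the file names here are placeholders, not anyone's actual URLs:

```apache
# 301-redirect long-gone pages to the home page (mod_alias).
Redirect permanent /old-page.html http://example.com/
Redirect permanent /mydirectory/old-file.html http://example.com/
```

Handling the dead links in .htaccess this way saves you from having to drop a PHP file into every affected directory, and it also covers old URLs that were plain files rather than directories.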