This started about 9 months ago... I had about 2,000 pages with addresses like "site.com/dir/names" but changed them all to remove the "/dir/" part, so now they look like "site.com/names". I set up 302 redirects, but they weren't working with Google. So I used their permanent URL removal tool and it removed all the old pages. Then I removed the 302 redirects. I made a sitemap to help Google find all my new pages (same pages, actually, just different URLs). I have updated my sitemap many times and have no problem creating the sitemaps or checking their accuracy. I have even tried renaming them (sitemap2, sitemap3), but for some reason it keeps giving me the following errors:

HTTP errors: site.com/names "In Sitemap"
HTTP errors: site.com/name2 "In Sitemap"
...
URLs restricted by robots.txt: site.com/name4 "In Sitemap"
URLs restricted by robots.txt: site.com/name3 "In Sitemap"
...

These old addresses are definitely not in my sitemap. I was hoping it would finally just stop saying these pages are in my sitemap, but it has been months and it hasn't. Anyone else experiencing similar problems?
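For what it's worth, a permanent URL move like this is normally signaled with 301 (permanent) redirects rather than 302s, which tell Google the old URLs are only temporarily moved. Assuming an Apache server with mod_rewrite (not confirmed in the post; this is an illustrative sketch using the "/dir/" pattern described above, not the actual config), the rule might look like:

```apache
# .htaccess sketch: permanently redirect old /dir/names URLs to /names.
# R=301 marks the move as permanent, so Google transfers indexing to the new URL;
# a 302 (temporary) redirect is often ignored for re-indexing purposes.
RewriteEngine On
RewriteRule ^dir/(.*)$ /$1 [R=301,L]
```

Leaving the 301s in place (instead of removing the redirects) also lets Google re-crawl the old URLs and clear them out on its own.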
You've posted this before, haven't you? Can we see your site, along with your robots.txt and sitemap files? When did Google last read your sitemap?