Is anybody else getting validation problems with some of their sites since the loss of the API system? I get errors like 'Ads not found on domain.com/somepagethatdoesntexist', and the page not only doesn't exist on the site itself but also isn't indexed in Google. I really don't know where the co-op is getting its information about that page, unless it's just randomly making up page names! Is there any possible workaround? It's not like I can just add these pages to my robots.txt, because they aren't indexed by Google in the first place...
Make sure you add a proper 404 ErrorDocument line to your .htaccess; this sometimes fixes errors about missing pages.
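For example, a minimal .htaccess sketch, assuming you're on Apache and have created your own error page at /404.html (adjust the path to wherever your page actually lives):

# Serve your own 404 page (which can carry your co-op ad code) for missing URLs
ErrorDocument 404 /404.html

That way, a request for a nonexistent page still returns a page of yours that includes the ad code, which may be why it clears some of those 'Ads not found' errors.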
My sites have been failing validation on things like RSS feeds and other pages that can't include co-op links. If you keep trying to validate, eventually it will succeed.
OK, I managed to validate some sites by simply validating over and over until it succeeded, like you suggested. However, two sites have even more serious problems. When I try to validate, I DO NOT get an error like 'Ads not found on http://www.domain.com/page.html'. I get errors like 'Ads not found on t', 'Ads not found on V', or even 'Ads not found on !'. This doesn't make any sense at all. It's not referring to pages on the site at all, just to single characters!
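Just a guess at what might be happening on their end: single-character 'pages' like t, V, and ! are the classic symptom of a script looping over a URL string character by character instead of over a list of URLs. A minimal Python sketch of that kind of bug (the variable names are hypothetical, not the co-op's actual code):

# Hypothetical illustration of the suspected bug, not the co-op's real code.
pages = "top.html"      # a single URL stored as a string...
for page in pages:      # ...but iterated as if it were a list of URLs
    print(f"Ads not found on {page}")   # reports one character at a time: t, o, p, ...

pages = ["top.html", "index.html"]      # iterating a list yields whole URLs
for page in pages:
    print(f"Checking {page}")

If that's what their validator is doing, there's nothing to fix on your side; it would be a bug in their script.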
How do I do this? That might be an even better idea! Although it doesn't work for everything. For example, one page without ads on it is style.css, and I don't know how to put ads on that. Another page was portal/network/index.cfm?FuseAction=SearchResults&Letter=F&startrow=1&orderby=c.username&profiletype=community&profilecategoryid=&area=Find. It also doesn't solve the problem I described in the post above, since those errors don't even seem to refer to pages on my sites...