I have a page that is simply a PHP page with an ad network on it and nothing else. It's used as an include on a new site that I just started running coop on. I can't put the coop ads on this page. How can I block coop from trying to validate this page and still have the page indexed? Thanks.

Never mind, I just excluded the page with robots.txt after reading some of the other threads here.
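In case it helps anyone else, the robots.txt entry is just a Disallow for that one file. Something like this, with /ad-include.php standing in for whatever your page is actually called:

User-agent: *
Disallow: /ad-include.php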
OK, I did get that page removed from the index, but Google has indexed my RSS feed pages, and obviously I can't run the coop code on them, so I'm still failing validation because of it. I don't want these pages removed from the index; isn't there anything I can do?
If you make the URLs identifiable as an RSS feed (for example with "RSS" in the URL somewhere) they will be excluded automatically.
Well...

Ads not found on http://www.computernetworkinghelp.com/index2.php?option=com_joomlaboard&func=sb_rss&no_html=1

There is one other, and it has RSS in the URL as well. Any other ideas?
Is that the only way? I don't think I can change them; the URLs are automatically generated by Joomla. Note: this isn't a problem on my other Joomla site.
I've had a validation-failed email reporting that ads are not on one URL... trouble is, it's an image that it quoted! http://www.russianglass.co.uk/catalog/images/Insects/Snail/sgsna024.thumb (.thumb is a redirect in .htaccess to a thumbnail generator that creates the image on the fly from the appropriately named .jpg.) Doesn't the checker use the Content-Type header to skip images?

Rob
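For reference, the .htaccess rule is roughly along these lines; thumb.php here is just a placeholder for my actual thumbnail script:

RewriteEngine On
# Hand any *.thumb request to the thumbnail generator, which builds it from the matching .jpg
RewriteRule ^catalog/images/(.+)\.thumb$ /catalog/thumb.php?img=$1.jpg [L]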
It's based on Google's index... [search=google]inurl:www.russianglass.co.uk/catalog/images/Insects/Snail/sgsna024.thumb[/search]
Is there anything else I can do? Will this problem go away when more pages get indexed, like on my other Joomla site? Does a one-legged duck swim in circles?
Yeah... it's weighted based on the importance of the page. Once you have more pages indexed, there's less of a chance that one of your feeds will be considered one of the most important pages.
Cool, then I won't worry about it. The site is new, so there are only about three pages indexed at the moment, and two of those are the problem pages. Thanks a bunch, this is a sweet program.