OK, this seems to be an almost daily thing... I have 12 or so sites running the co-op. Why do I get emails saying the co-op ads are not present, when everything is fine as soon as I go revalidate? It's different sites each time, not the same ones... I've asked before but never got an answer. Can't you try revalidating again before deactivating sites and sending emails? If I can revalidate, aren't the ads being served properly? Thanks, Mike
Thanks for reading, Smyrl, you're nice, but your response doesn't make sense. Every day a different one of my sites has its ads turned off because it doesn't revalidate, and I get an email saying my ads were not found. When I go to check, the ads are always working. I wish they'd check better, more often, and more thoroughly before shutting them off. I think they should send a warning FIRST, and then, if there are still no ads after a set period, send the email and deactivate. Am I the only one this is happening to???
I got an email saying my site doesn't have ads on a page, but the page is a directory listing. Sounds like a minor bug.
When I go to that URL I don't see any ads, which is the problem. You need to have a page at that URL with the ads on it, or make sure that URL is no longer indexed in Google.
as long as your validated? Bull,im validated,others adds show but my bloody text is rejected. Every time I get an email like that saying ads were not found and I check the URL, my sites are showing ads. It seems odd that they weren't validated. Is there a timeout or something?
I get this problem all the time - so for a newbie how do you 'block' a folder/directory from being indexed? Thanks
Look at a robots.txt tutorial; the file is very easy to make. You can also put files you do not want indexed in a password-protected directory. Shannon
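For example, a minimal robots.txt, assuming you want to keep all crawlers out of a hypothetical /private/ folder, might look like this (the file has to sit at the root of the domain, e.g. http://www.yoursite.com/robots.txt):

# Applies to all crawlers
User-agent: *
# Blocks crawling of /private/
Disallow: /private/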
"as long as your validated? Bull,im validated,others adds show but my bloody text is rejected." I didn't write that??? Why was it in a post with my handle???
Another part of the puzzle for you may be this: revalidating pulls from different Google datacenters. You may have validated against a datacenter that doesn't have your ad-less page indexed, and then when the auto-validation hits a datacenter that does have that page indexed... you're out again.
Okay, I read up on robots.txt but learned nothing new. If I bar the robots from indexing a folder, does that not also bar them from indexing ALL the files in that folder? Seems like no one has an answer to this. I've lost about 4k in weight over this issue, as I pulled the site from the co-op. This site was the first I ever submitted to DP and had run okay since. But lately G has chosen to index the folders (as well as the pages/files). Of course I can't serve ads on a folder, and consequently DP says my ads are not validating.
You can disallow individual files, or move unwanted files into a disallowed directory:

User-agent: *
Disallow: /yourpage.html
Sure, but if I disallow "the folder", then all the files (pages) inside the folder are also disallowed, in effect taking my site out of the index. A little drastic just to get DP ads showing on de-indexed pages. To make this clear, e.g. for the URL http://www.mydomian/loans/top-company.htm, Google is indexing both http://www.mydomian/loans/ AND http://www.mydomian/loans/top-company.htm
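One possible workaround, a sketch only: Googlebot supports a $ end-of-URL anchor as an extension to standard robots.txt, so, using your example paths, something like this should block only the bare folder URL while leaving the pages inside it crawlable:

User-agent: Googlebot
# $ anchors the match at the end of the URL (a Google extension,
# not part of the original robots.txt standard)
Disallow: /loans/$

That should de-index http://www.mydomian/loans/ itself without touching /loans/top-company.htm, though crawlers that don't understand $ will simply ignore the rule.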