Hi guys, I'd like to suggest (as time permits) an improvement to how pages are checked in Google. I have a few (but not many) "supplemental results" in Google for a lot of my sites, whereby the pages no longer exist on the server and therefore display no ads. When the co-op check finds these pages and sees no ads, the accounts get suspended and for periods of time I lose a large chunk of my weight. It's a bit of a pain in the arse, particularly when you don't discover this until 2-3 days later and the site hasn't revalidated yet. Best... Duncan
Yeah, the email arrives, but it's no good if you're away for 3-4 days before checking your emails, for example. When you're losing such large chunks of weight as a result it can have a decent (negative) impact... and then after revalidating it takes another 4-odd days to catch back up to what your total weight should be. It just seems wrong to even take supplemental results into account, as they are not actual results... they are "supplemental". Best... Duncan
I've been trying to do that, and today when I tried to enter my robots.txt file path, they said it was too long. So I emailed them. Now the URL controller thingy is gone... I think I made them mad!
I just used that Google tool to have them remove a bunch of virtual shopping pages I think might have been the cause of me being de-listed. Worked like a charm. They actually removed all the pages from the cache and index in less than 24 hours. I used the "Urgent removal" link on the page. No problems with the path to the robots file, but mine is pretty short: just /robots.txt.
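For anyone trying the same thing: the removal tool reads your robots.txt, so a few Disallow lines are enough to flag the dead pages. The shop paths below are just made-up examples, not my real ones:

  User-agent: Googlebot
  Disallow: /shop/virtual-page-1.html
  Disallow: /shop/virtual-page-2.html

Once the tool sees those entries it queues the URLs for removal from the index and cache.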
Are you sure you read the message correctly? I used it recently and the message was that my robots.txt file was too long. I just split it up and removed the URLs over the course of a few days. Worked for me that way.
Yup. The exact message was: "That robots.txt file is too long. Please email for assistance." I'm hoping they just honor my email and wipe out those sites. It is large...
Guess I misunderstood. I wonder if that means GBot has a length limit, and if your robots.txt file is too long it won't read it, or won't follow any instructions past a certain point? This could tie in with another thread where G was indexing pages excluded in the robots file.
Could you do some clever mod_rewrite to serve up an oops page? Can you add co-op links to a 404, and would that cover it? I guess I'm not so sure how 404s are handled, served, etc...
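Something like this in .htaccess, maybe? Just a sketch off the top of my head; "oops.php" is a made-up name for whatever page would carry the co-op code:

  # Send any request for a file or directory that no longer exists to the ad-carrying page
  RewriteEngine On
  RewriteCond %{REQUEST_FILENAME} !-f
  RewriteCond %{REQUEST_FILENAME} !-d
  RewriteRule . /oops.php [L]

No idea if the validator is happy with that, but it should at least put ads on every dead URL.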
Are you sure that's the right link (email)? In my case I got this link: http://www.google.com/intl/en/contact/index.html and I don't know what to do... choosing the "Webmaster Info" link just takes me back to where I started... it's like a closed circle. BTW, what do you think about this "GBot has a length limit" idea? Or does the limit only apply to the removal tool? Thank you.
Solution: I had the same problem - hundreds of pages deleted and new ones added - too many to submit to Google. Instead, I made a 404.shtml error page with HTML identical to my index page. When people hit an old page, including the validator, they are shown the index page (which has DP ads on it). The catch is to add ".shtml" to your "AddType" line of code so that it parses. And I've found that AddHandler works 100% of the time for me, while AddType rarely works (since I work through cPanel to make the needed pages). This is the line of code to use in the .htaccess: AddHandler application/x-httpd-php .php .htm .html .shtml
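Put together, the relevant part of my .htaccess looks roughly like this (the ErrorDocument line is an assumption about how the custom page gets hooked up; cPanel's error pages tool may already add it for you, and the 404.shtml name is just what I used):

  # Run these extensions through PHP so 404.shtml can execute the ad code
  AddHandler application/x-httpd-php .php .htm .html .shtml
  # Serve the look-alike index page whenever a missing URL is requested
  ErrorDocument 404 /404.shtml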