Suggested improvement...

Discussion in 'Co-op Advertising Network' started by duncancarver, Feb 6, 2005.

  1. #1
    Hi Guys,

    I'd like to suggest (as time permits) an improvement to how pages are checked in Google. I have a few (but not many) "supplemental results" in Google for a lot of my sites, where the pages no longer exist on the server and therefore display no ads. When the co-op check finds these pages and finds no ads, the account becomes suspended, and for periods of time I lose a large chunk of my weight. It's a bit of a pain in the arse, particularly when you don't discover this until 2-3 days later and the site hasn't revalidated yet.
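    In other words, something along these lines (a rough sketch only - I don't know how the checker actually works, and the function name is made up):

    # hypothetical sketch of the suggested check - not the real co-op validator
    import urllib.request
    import urllib.error

    def check_page(url):
        try:
            urllib.request.urlopen(url)
        except urllib.error.HTTPError as e:
            if e.code == 404:
                # page is gone entirely - skip it rather than suspending
                # the account for "missing ads"
                return "skip"
            raise
        # page exists - carry on with the normal "are the ads present?" check
        return "validate"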

    Best...

    Duncan
     
    duncancarver, Feb 6, 2005 IP
  2. flawebworks

    flawebworks Tech Services

    Messages:
    991
    Likes Received:
    36
    Best Answers:
    1
    Trophy Points:
    78
    #2
    You should be getting an email when this happens; and then you can go in and revalidate.
     
    flawebworks, Feb 6, 2005 IP
  3. duncancarver

    duncancarver Peon

    Messages:
    133
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Yeah, the email arrives, but it's no good if you're away for 3-4 days before checking your emails, for example. When you're losing such large chunks of weight as a result, it can have a decent (negative) impact... and then after revalidating it takes another 4-odd days to catch back up to what your total weight should be. It just seems wrong to even take supplemental results into account, as they are not actual results... they are "supplemental".

    Best...

    Duncan
     
    duncancarver, Feb 6, 2005 IP
  4. rtheodorow

    rtheodorow Peon

    Messages:
    129
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #4
    rtheodorow, Feb 9, 2005 IP
  5. flawebworks

    flawebworks Tech Services

    Messages:
    991
    Likes Received:
    36
    Best Answers:
    1
    Trophy Points:
    78
    #5
    I've been trying to do that, and today when I tried to enter my robots.txt file path, they said it was too long. So I emailed them. Now the URL controller thingy is gone....

    I think I made them mad!
     
    flawebworks, Feb 9, 2005 IP
  6. lowrider14044

    lowrider14044 Raider

    Messages:
    260
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #6
    I just used that Google tool to have them remove a bunch of virtual shopping pages that I think might have been the cause of me being de-listed. Worked like a charm. They actually removed all the pages from the cache and index in less than 24 hours. I used the "Urgent removal" link on the page. No problems with the path to the robots.txt file, but it's pretty short. Just /robots.txt. :)
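    The file itself is just something like this (the path is made up for illustration):

    # blocks the virtual shopping pages so the removal tool will drop them
    User-agent: Googlebot
    Disallow: /shopping/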
     
    lowrider14044, Feb 10, 2005 IP
  7. Hank

    Hank Guest

    Messages:
    57
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #7
    I was able to get a few pages entered into the removal tool just now. Hope they go away!

    Hank
     
    Hank, Feb 10, 2005 IP
  8. jarvi

    jarvi Well-Known Member

    Messages:
    127
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    103
    #8
    Are you sure you read the message correctly? I used it recently, and in my case it was that my robots.txt file was too long. I just split it up and then removed the pages over the course of a few days. Worked for me that way.
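    Something along these lines (paths made up for illustration):

    # batch 1 - submit via the removal tool, wait for it to process
    User-agent: Googlebot
    Disallow: /old/page1.html
    Disallow: /old/page2.html
    # ...then swap in the next batch of Disallow lines and submit again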
     
    jarvi, Feb 10, 2005 IP
  9. flawebworks

    flawebworks Tech Services

    Messages:
    991
    Likes Received:
    36
    Best Answers:
    1
    Trophy Points:
    78
    #9
    Yup. The exact message was:

    *That robots.txt file is too long. Please email for assistance!

    I'm hoping they just honor my email and wipe out those sites. It is large....
     
    flawebworks, Feb 10, 2005 IP
  10. lowrider14044

    lowrider14044 Raider

    Messages:
    260
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #10
    Guess I misunderstood. I wonder if that means GBot has a length limit, so that if your robots.txt file is too long it won't read it, or won't follow any instructions past a certain point? This could tie in with another thread where G was indexing pages excluded in the robots.txt file.
     
    lowrider14044, Feb 11, 2005 IP
  11. skattabrain

    skattabrain Peon

    Messages:
    628
    Likes Received:
    18
    Best Answers:
    0
    Trophy Points:
    0
    #11
    Could you do some clever mod_rewrite to serve up an oops page? Can you add co-op links to a 404 page, and would that cover it?

    I guess I'm not so sure how 404s are handled, served, etc....
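    Something like this maybe (just a guess, untested - "oops.php" is a made-up page name):

    # send requests for files that no longer exist to an oops page
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule .* /oops.php [L]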
     
    skattabrain, Feb 15, 2005 IP
  12. alexo

    alexo Well-Known Member

    Messages:
    371
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    108
    #12
    Are you sure it's the right link (email)? In my case I got this link: http://www.google.com/intl/en/contact/index.html and don't know what to do..

    Choosing the "Webmaster Info" link takes me back to where I started..
    it's like a closed circle.

    Btw... what do you think about this "GBot has a length limit" idea?
    Or does the limit only apply to the removal tool?

    thank you
     
    alexo, Oct 21, 2005 IP
  13. joewood

    joewood Peon

    Messages:
    100
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #13
    Solution:

    I had the same problem - hundreds of pages deleted and new ones added - too many to submit to Google.

    Instead, I made a 404.shtml error page with HTML identical to my index page. When people hit an old page, including the validator, they are shown the index page content (which has DP ads on it).

    The catch is to add ".shtml" to your handler line so the page gets parsed as PHP. And I've found that AddHandler works 100% of the time for me, while AddType rarely does (since I work through the cPanel to make the needed pages).

    This is the line of code in the .htaccess to use:

    AddHandler application/x-httpd-php .php .htm .html .shtml
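
    Depending on the host, you may also need an ErrorDocument line so Apache actually serves 404.shtml for missing pages (cPanel often sets this up for you) - a minimal sketch:

    # serve the PHP-parsed error page for any missing URL
    ErrorDocument 404 /404.shtml
    AddHandler application/x-httpd-php .php .htm .html .shtml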
     
    joewood, Oct 27, 2005 IP