Auto weight update

Discussion in 'Co-op Advertising Network' started by Jayess, Jan 26, 2005.

  1. #1
    Occasionally my sites drop out because the links can't be found on certain pages.

    This is always my Word documents! How can I put the ads on a .doc? :confused:

    Anyway, I normally get the email, login and revalidate and all is fine. Until it drops next time.

    Now, I'm going on holiday next week.

    The autovalidate on day 1 kicks me out because it can't find the links.

    Does it try to validate on day 2?

    Does it ever try to validate again? Or do I have to do it manually?
     
    Jayess, Jan 26, 2005 IP
  2. SEbasic

    #2
    You need to exclude the .docs with robots.txt, I think.
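    A robots.txt along these lines would keep crawlers away from the Word files (the /docs/ path is illustrative, and the wildcard rule is a non-standard extension that Googlebot honors):

    ```
    # Block all crawlers from a hypothetical directory holding the Word files
    User-agent: *
    Disallow: /docs/

    # Googlebot also understands wildcard patterns (non-standard extension)
    User-agent: Googlebot
    Disallow: /*.doc$
    ```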
     
    SEbasic, Jan 26, 2005 IP
  3. Jayess

    #3
    I also thought that. But I want Google to find and read and index them!
     
    Jayess, Jan 26, 2005 IP
  4. T0PS3O

    #4
    Then you need to put the docs below the base URL in the tree. Or put the base URL above the docs in the tree.
     
    T0PS3O, Jan 26, 2005 IP
  5. Jayess

    #5
    ermm? Could you explain more?
     
    Jayess, Jan 26, 2005 IP
  6. T0PS3O

    #6
    T0PS3O, Jan 26, 2005 IP
  7. Jayess

    #7
    Right, now I understand, and I tried that, but then I get 0 weight (obviously), because the pages Google knows about are in the root.

    Shawn, would it not be sensible to code the co-op to ignore .exes, .docs, etc.?

    Or, should I say, to only count the URL if it's .html, .php, .asp, etc.
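    A minimal sketch, in Python, of the extension filter being proposed here (the names and extension list are illustrative, not the co-op's actual code):

    ```python
    from urllib.parse import urlparse

    # Extensions that can actually carry the ad HTML; "" covers
    # extension-less paths like the site root or /about/.
    COUNTABLE_EXTENSIONS = {"", ".html", ".htm", ".php", ".asp"}

    def is_countable(url: str) -> bool:
        """Return True if the URL looks like a page that can display ads."""
        path = urlparse(url).path
        dot = path.rfind(".")
        ext = path[dot:].lower() if dot != -1 else ""
        return ext in COUNTABLE_EXTENSIONS

    pages = [
        "http://example.com/index.php",
        "http://example.com/report.doc",
        "http://example.com/setup.exe",
    ]
    # Only pages that can show the ad links contribute to weight.
    weight = sum(1 for url in pages if is_countable(url))
    ```

    With a filter like this, indexed .doc and .exe URLs would simply not count toward weight, instead of failing validation.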
     
    Jayess, Jan 27, 2005 IP
  8. nevetS

    #8
    maybe put the docs in a subdomain (docs.domain.com) and link to them from your main domain (www.domain.com).

    The problem is that the indexed docs count towards your weight and there isn't a way to check how many .htm .php .asp etc type pages are indexed. So the coop has a problem in figuring out how much weight to give you if it doesn't count .doc files. You could have 1000 pages indexed, and 980 of them are word docs so only 20 pages display ads, and the coop would mistakenly give you 1000 pages of weight. That's probably not the case with you, but it would be a short time before others also started taking advantage of a "loophole" if it became apparent.

    If the docs aren't a duplicate of your site data, you could do a "save as web page" on them, link to those files, and have a download link to a subdomain or alternate domain.

    When you move them, the co-op will still be looking for them because they will still be in Google's index (and so will Googlebot). So after you move them, redirect the link with a 301 (permanently moved) response. (If you are on an Apache server, you can do that in your .htaccess file.)
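    The 301 redirect described above could look like this in .htaccess (the file name and subdomain are placeholders):

    ```
    # Permanently redirect one moved document to its new home
    Redirect 301 /whitepaper.doc http://docs.example.com/whitepaper.doc

    # Or catch every .doc under the site root with mod_rewrite
    RewriteEngine On
    RewriteRule ^(.*\.doc)$ http://docs.example.com/$1 [R=301,L]
    ```

    Either way, Googlebot and the co-op validator following the old URL get the permanent-move response instead of a 404.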
     
    nevetS, Jan 27, 2005 IP
  9. Jayess

    #9

    Now that's a good idea. Gonna be a pain, though: lots of docs, lots of sites.
     
    Jayess, Jan 27, 2005 IP