Google Blackhat for Sitemap

Discussion in 'Google Sitemaps' started by banker0679, Dec 2, 2008.

  1. #1
    I just thought of something... I'm sure a lot of people may be doing this...


    Get an established domain, and in its robots.txt you put:


    User-agent: *
    Allow:
    sitemap: http://domain2.com/sitemap.xml

    # BEGIN XML-SITEMAP-PLUGIN
    Sitemap: http://domain2.com/sitemap.xml.gz
    # END XML-SITEMAP-PLUGIN

    Some people know this, but what happens if your domain is

    domain1.com

    and you put that code for domain2.com in the robots.txt for domain1.com?

    Of course, this would be to get domain2.com crawled faster, especially if it's a new domain.

    Your thoughts?
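    A quick way to see what a crawler would find in a robots.txt like the one above is to pull out the Sitemap: lines and compare hosts. A minimal sketch using only the Python standard library (domain1.com and domain2.com are the placeholder names from the post, not real sites):

    ```python
    # Extract the Sitemap entries from a robots.txt and flag any that
    # point at a different host than the one serving the file.
    from urllib.parse import urlparse

    def sitemap_entries(robots_txt: str):
        """Return the URLs listed on Sitemap: lines (case-insensitive)."""
        urls = []
        for line in robots_txt.splitlines():
            key, _, value = line.partition(":")
            if key.strip().lower() == "sitemap":
                urls.append(value.strip())
        return urls

    def cross_host(robots_url: str, sitemap_url: str) -> bool:
        """True when the sitemap lives on a different host than robots.txt."""
        return urlparse(robots_url).netloc != urlparse(sitemap_url).netloc

    robots = """User-agent: *
    Allow: /
    Sitemap: http://domain2.com/sitemap.xml
    """

    for url in sitemap_entries(robots):
        flagged = cross_host("http://domain1.com/robots.txt", url)
        print(url, "CROSS-HOST" if flagged else "same host")
    ```

    Run against the example above, the domain2.com sitemap gets flagged as cross-host, which is exactly the situation the trick relies on.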
     
    banker0679, Dec 2, 2008 IP
  2. babyLEO

    babyLEO Peon

    Messages:
    42
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    It might help in crawling the new domain. I think it will work.
     
    babyLEO, Dec 8, 2008 IP
  3. SanDiegoGraphics

    SanDiegoGraphics Member

    Messages:
    47
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    31
    #3
    I can't see why this wouldn't work. I have a couple of high-ranking sites. I'll test it and see what sort of results it yields.
     
    SanDiegoGraphics, Dec 8, 2008 IP
  4. banker0679

    banker0679 Well-Known Member

    Messages:
    407
    Likes Received:
    20
    Best Answers:
    0
    Trophy Points:
    110
    #4
    The sole intention would be to benefit the new domain...
     
    banker0679, Dec 8, 2008 IP
  5. onlywin

    onlywin Greenhorn

    Messages:
    97
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    18
    #5
    Hmm... it could work. Some real case studies would be great!

    Did you test this method with some domains? What happened?
     
    onlywin, Dec 12, 2008 IP
  6. pstevens

    pstevens Peon

    Messages:
    6
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #6
    It will not work. I have dynamically generated sitemaps for my domains, and during testing I had some mix-ups between the domains.

    The only thing that happens is that Google will download the sitemap from domain1, read it, find links to domain2, and report them as errors.

    It will not use the sitemap until the sitemap downloaded from domain1 contains only links to domain1.
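    The validation behaviour described here can be sketched roughly: the sitemap is fetched from one host, and any <loc> entries pointing at a different host are treated as errors rather than crawled. A rough illustration using Python's standard XML parser (the two-URL sitemap is made up for the example; this only mimics the reported behaviour, not Google's actual implementation):

    ```python
    # Split a sitemap's URLs into accepted (same host as the sitemap)
    # and errors (cross-host), mirroring the behaviour described above.
    import xml.etree.ElementTree as ET
    from urllib.parse import urlparse

    NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    def check_sitemap(sitemap_host: str, xml_text: str):
        """Return (accepted, errors) lists of URLs from the sitemap."""
        accepted, errors = [], []
        for loc in ET.fromstring(xml_text).iter(f"{NS}loc"):
            url = loc.text.strip()
            if urlparse(url).netloc == sitemap_host:
                accepted.append(url)
            else:
                errors.append(url)
        return accepted, errors

    xml_text = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://domain1.com/page.html</loc></url>
      <url><loc>http://domain2.com/page.html</loc></url>
    </urlset>"""

    accepted, errors = check_sitemap("domain1.com", xml_text)
    print("accepted:", accepted)  # only the domain1.com URL
    print("errors:", errors)      # the domain2.com URL is rejected
    ```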
     
    pstevens, Dec 16, 2008 IP
  7. pstevens

    pstevens Peon

    Messages:
    6
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #7
    It seems there is a way to do this, and it's not even black hat: http://www.google.com/support/webmasters/bin/answer.py?answer=75712
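    For context, the documented cross-submission (per the sitemaps.org protocol) works in the opposite direction from the trick in the first post: the robots.txt on the site whose URLs are being submitted references a sitemap hosted elsewhere, and that reference itself serves as proof of ownership. A sketch (the sitemap path is hypothetical):

    ```text
    # In http://domain2.com/robots.txt -- the site whose URLs are submitted:
    User-agent: *
    Sitemap: http://domain1.com/sitemap-for-domain2.xml

    # That sitemap may then list domain2.com URLs even though it is
    # hosted on domain1.com, because only domain2's own robots.txt
    # could have pointed Google at it.
    ```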
     
    pstevens, Dec 19, 2008 IP
  8. onlywin

    onlywin Greenhorn

    Messages:
    97
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    18
    #8
    thanks pstevens! good information!
     
    onlywin, Dec 20, 2008 IP
  9. bauang

    bauang Peon

    Messages:
    12
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #9
    Banker0679, did you try it? How did it go?
     
    bauang, Dec 20, 2008 IP
  10. lomazoma

    lomazoma Active Member

    Messages:
    56
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    93
    #10
    I can't understand it!! How can I do it?
     
    lomazoma, Jan 5, 2009 IP
  11. Sara Rana

    Sara Rana Banned

    Messages:
    32
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #11
    Well, it might get pages indexed faster, but it still won't boost the rankings, so I'm not so sure it's "black hat" SEO.
     
    Sara Rana, Jan 7, 2009 IP
  12. Elward

    Elward Peon

    Messages:
    135
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #12
    It seems there are much easier ways to get your new site crawled without doing this. If anything, it seems like it would penalize site #1 in the long run.
     
    Elward, Jan 7, 2009 IP