I just thought of something... I'm sure a lot of people may already be doing this. Get an established domain, and in its robots.txt you put:

User-agent: *
Allow:
Sitemap: http://domain2.com/sitemap.xml
# BEGIN XML-SITEMAP-PLUGIN
Sitemap: http://domain2.com/sitemap.xml.gz
# END XML-SITEMAP-PLUGIN

Some people know this, but what happens if your domain is domain1.com and you put that code for domain2.com in the robots.txt for domain1.com? Of course, the point would be to get domain2.com crawled faster, especially if it's a new domain. Your thoughts?
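For anyone trying this, the file being pointed at is just an ordinary XML sitemap for the second domain. A minimal sketch (domain2.com and the page paths are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://domain2.com/</loc>
  </url>
  <url>
    <loc>http://domain2.com/new-page/</loc>
  </url>
</urlset>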
I can't see why this wouldn't work. I have a couple of high-ranking sites. I will test it and see what sort of results it yields.
Hmm... it could work... some real case studies would be great! Did you test this method with some domains, and what happened?
It will not work. I have dynamically generated sitemaps for my domains, and during testing I had some mix-ups between the domains. The only thing that happens is that Google downloads the sitemap from domain1, reads it, finds links to domain2, and reports it as an error. It will not use the sitemap until the one downloaded from domain1 contains only links to domain1.
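In other words, going by this report (the page paths here are placeholders), Webmaster Tools treats any URL that doesn't match the host the sitemap was fetched from as an error:

Sitemap fetched from http://domain1.com/sitemap.xml:
  <loc>http://domain1.com/page/</loc>   - same host, accepted
  <loc>http://domain2.com/page/</loc>   - different host, reported as an error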
Seems there is a way to do this, and it's not even black hat http://www.google.com/support/webmasters/bin/answer.py?answer=75712
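If that link is Google's page on sitemap cross-submissions, the supported setup appears to run in the opposite direction from the OP's idea: the sitemap file can be hosted anywhere, but it's the robots.txt of the site whose URLs are listed in it (and/or verification of both sites in Google Webmaster Tools) that authorizes it. A rough sketch, with all hostnames and paths as placeholders:

robots.txt served from http://domain2.com/robots.txt (the new site):

User-agent: *
Sitemap: http://domain1.com/sitemaps/domain2.xml

The file at http://domain1.com/sitemaps/domain2.xml would still contain only domain2.com URLs, so you get to reuse the established domain's hosting, but it doesn't let domain1's robots.txt vouch for domain2's pages.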
Well, it might get pages indexed faster, but it still won't boost the rankings, so I'm not so sure it's "black hat" SEO.
It seems there are much easier ways to get your new site crawled without having to do this. If anything, it seems like it would penalize site #1 in the long run.