Yeah, and even if you do it properly you still get SMACKED, because it can actually make Google de-index other sites. Strip out the formatting and replace it with your own. That kills the duplicate content problem and dodges de-indexing. Bonus points for inserting a meta tag to deny the Google cache (rough sketch of both at the bottom of this post).

NOW. If you ever really want to have fun with a proxy (which may be jackin' your content), start using it to visit itself, and then get THOSE pages indexed. Once you have a single page indexed where it's really the proxy looking at itself, there's a way (use your imagination, I'm not saying it here) to make GoogleBot loop, crawling the same crap over and over under different URLs. And all those URLs are on the proxy's domain, so eventually GoogleBot just gets angry because it crawled 10,000 dupe-content pages. Takes a bit of work, but the havoc plus the bandwidth bill makes it worth it. When I get the time, I'll figure out exactly which pieces of proxy software it works with. heh. Blackhat that.
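
A minimal sketch of the "strip and re-skin" idea from the first paragraph: take whatever HTML the proxy fetched, throw away the source site's markup, keep only the text, and wrap it in your own template with a noarchive directive. The noarchive robots meta tag is real; everything else here (function names, the template) is illustrative, so plug it into whatever proxy script you actually run:

```python
# Sketch: strip a fetched page's formatting, re-wrap it in your own
# markup, and add a noarchive tag so Google won't keep a cached copy.
import re
from html import unescape

# Real robots directive: tells Google not to cache this page.
NOARCHIVE = '<meta name="robots" content="noarchive">'

def reskin(fetched_html: str, title: str = "proxied page") -> str:
    # Drop script/style blocks entirely, then strip all remaining tags.
    text = re.sub(r"(?is)<(script|style).*?</\1>", "", fetched_html)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    text = unescape(re.sub(r"\s+", " ", text)).strip()
    # Re-wrap the bare text in your own template, noarchive included,
    # so the output no longer duplicates the source page's markup.
    return (f"<html><head><title>{title}</title>{NOARCHIVE}</head>"
            f"<body><div class='my-skin'><p>{text}</p></div></body></html>")

if __name__ == "__main__":
    sample = "<html><body><h1>Stolen?</h1><p>Hi &amp; bye</p></body></html>"
    print(reskin(sample))
```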
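
And to make the "proxy visiting itself" part concrete (just the URL nesting that's stated above, not the indexing trick): each pass through the proxy URL-encodes the previous URL and feeds it back in, so every nesting level is a brand-new URL serving the same content. This assumes a hypothetical proxy script that takes the target in a ?q= parameter; real scripts each name and encode it differently, so treat that as a placeholder:

```python
# Sketch: building self-referencing proxy URLs. "proxy.example.com" and
# the ?q= parameter are placeholders for whatever the target script uses.
from urllib.parse import quote

PROXY = "http://proxy.example.com/index.php"

def through_proxy(target_url: str) -> str:
    # Each wrap percent-encodes the previous URL and feeds it back in.
    return f"{PROXY}?q={quote(target_url, safe='')}"

# One level deep: the proxy fetching its own front page.
level1 = through_proxy(PROXY)
# Wrap again and the URL string changes at every level even though the
# content doesn't -- which is why a crawler following these links sees an
# endless supply of "new" duplicate pages, all on the proxy's domain.
level2 = through_proxy(level1)
print(level1)
print(level2)
```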