I recently got snagged by the Google duplicate content filter, unintentionally (I think). I have a site that is not ranking for its keywords and is showing many pages in the supplemental results. Two sites were created: one targeting "keyword city" and the other "keyword state". I am in the process of tearing apart both sites and rebuilding them, rewording the content, and changing titles and meta data so that no content matches. My question is: what is the best way to flush the old data from Google and recover from this? Should I:

A. Rename all old and new website pages and let Googlebot get a 404 error.
B. Keep the filenames of the pages the same, with the updated content.
C. Rename all old and new website pages, and create pages with a single "click here" link for each of the old pages.

At first I was going to do C, but then I figured I just want Google to flush that content ASAP. Is a 404 error a good way to do this, or will the old content just hang around in the cache for a very long time? Maybe option B is the best? Any advice would be greatly appreciated.
301 redirect all the old pages to the new pages (or to the homepage) and create a whole new set of pages. Better still, ditch one of the sites entirely and redirect it to the other one.
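If the sites run on Apache, the redirects can go in an .htaccess file. Just a rough sketch, assuming mod_alias is enabled and overrides are allowed; all the paths and example.com are placeholders, not your actual URLs:

# On the OLD site: map individual old pages to their new counterparts.
Redirect 301 /keyword-city.html http://www.example.com/new-page.html
Redirect 301 /old-services.html http://www.example.com/services.html

# Or, to ditch the whole site, redirect every URL to the other domain
# (mod_alias treats this as a prefix match, so paths are preserved):
Redirect 301 / http://www.example.com/

The per-page version is generally preferable, since any link juice pointing at a specific old page flows to its direct replacement rather than getting dumped on the homepage.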
Yeah, probably a 301 to the homepage is the best bet, right? That way everything is in the clear, and the bots can just find and spider the new pages and content. By the way, how long do you suggest leaving the 301 in place?
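For reference, here's roughly what I was thinking of putting in place to send everything to the homepage (a sketch only, assuming Apache with mod_rewrite; example.com stands in for my domain):

# On the OLD site: catch every request and 301 it to the new homepage.
RewriteEngine On
RewriteRule .* http://www.example.com/ [R=301,L]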
So after the 301 is put in place (old document > homepage), the next time Google tries to spider that document it will find the redirect and should drop the page from its index, right?