Hmm, a couple of questions here. The main thing is that I'm trying to remove a large number of URLs with parameters from Google's index. So here are the questions:

1. Can I do this with the Removal Tool in Google Webmaster Tools? I've searched the web and it seems I can't.

2. I've already blocked all these pages in robots.txt. This means the pages won't be indexed from this point on, but all the pages already in the index are still there. Theoretically they will drop out gradually over time, but that will definitely take a couple of months, if not years (we're talking about 200K pages). So, how can I speed up this removal process? I was thinking about serving an HTTP 404 status code on these pages while still displaying their content (I need to do this because all these pages will still remain on the servers). Will this trick do any good? There's a rough sketch of what I mean below.

Looking for any advice here. Thanks!
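To make the 404 idea concrete, here's a minimal sketch, written in Python/Flask purely for illustration (our actual stack is different, and the route is made up):

```python
# Sketch only: return the page body as usual, but with a 404 status code,
# so crawlers treat the URL as gone while human visitors still see the content.
from flask import Flask

app = Flask(__name__)

@app.route("/old-page")  # hypothetical route standing in for a parameterized URL
def old_page():
    html = "<html><body><h1>Old page</h1>Content stays visible.</body></html>"
    # Same body as before, different status: Flask views may return (body, status).
    return html, 404

if __name__ == "__main__":
    app.run()
```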
Why not just 301 (permanent) redirect all these pages to your main page, or to their respective new versions? Letting pages simply drop out of the index hurts your SERPs.
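A blanket 301 is usually only a few lines at the server or application layer; here's a rough sketch in Python/Flask (the /old/... pattern is hypothetical, match whatever your real URLs look like):

```python
# Sketch: permanently redirect all old parameterized URLs to the main page.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old/<path:rest>")  # made-up pattern; adapt to your actual URL scheme
def redirect_old(rest):
    # 301 tells crawlers the move is permanent, so link equity follows the redirect.
    return redirect("/", code=301)

if __name__ == "__main__":
    app.run()
```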
Rest assured, a 301 was the first thing that crossed my mind. But in this case it won't do me any good: redirecting 200K pages to my main page is not a solution. Redirecting the old pages to their new versions would be the best way to do it, but my programmers claim it's impossible due to some framework incompatibilities. (We've changed frameworks; long story.)