Hello all, I've been doing some research and I believe this may be possible, but for the life of me I can't figure it out!

I recently rearranged my site, adding subdirectories so I could URL rewrite more efficiently (previously, every page on my site was in the root directory, and it was very slow). However, Google still has my old pages cached. Take my articles, for example:

old address: site/article1.html
new address: site/articles/article1.html

For the benefit of users coming in from search engines, I put up a page at each old address with a link: "The article you're trying to access, "Article Title", has been relocated to the following address, please update your bookmarks: (link)"

Well... I guess with the article title and link in there, Google decided these were all unique pages, so G indexed almost every one of these gateway pages, which were intended for actual visitors, not search engines. Despite the crawlable link, G has yet to find and cache all of the articles in their new location.

What should I do? I don't mind simply removing these gateway pages from the cache; I know that G will find my articles eventually, since they're linked from all over the site.

1. Put a noindex tag in the head of these gateway pages so Google drops them?
2. Use .htaccess to redirect requests for article1.html over to /articles/article1.html? (I would prefer for this to be dynamic; I don't want 200 entries in my .htaccess - that's why I moved my whole site! See the sketch below for what I have in mind.)
3. Is there a robots.txt option?

Any experience with this?
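Rough sketch of what I mean for option 2 - this assumes Apache with mod_rewrite enabled and that all the old articles follow the articleN.html naming above; one rule in the root .htaccess instead of 200 Redirect lines:

RewriteEngine On
# Any old root-level article URL gets a 301 to its new /articles/ location
RewriteRule ^article([0-9]+)\.html$ /articles/article$1.html [R=301,L]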
Nevermind! I had forgotten about the PHP header option. Can anyone see a problem with this usage? I put this in article1.html (which is actually one of a series of PHP files); in my code, $pid is the page ID, in this case 1:

header( "HTTP/1.1 301 Moved Permanently" );
header( "Location: http://www.tubanews.com/articles/article$pid.html" );
exit; // stop the script so nothing else runs after the redirect is sent
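One thing to watch, from my understanding of header(): both calls have to come before the script sends any output (no HTML or even whitespace ahead of them), and the exit keeps the rest of the old gateway page from executing once the 301 has been issued.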