Hello. I currently have a website with about 5,000 pages indexed by Google. All of the pages are dynamically generated; here is an example of a URL I am currently using: http://www.planjam.com/reviewer/activities/Fun/100125.

We're now going through a huge redesign and restructuring of the database, and the timing is kind of disappointing because many of these pages have just made it out of Google's supplemental index and are finally seeing traffic. With the relaunch, many of the URLs are going to be rewritten, and I'm wondering what the best way to handle the transition would be. The URLs currently in Google's index will no longer exist, so I will probably have to block Google from crawling that folder in the robots.txt file. Does anyone know how long that should take, and whether it will be effective?

I'm also launching a lot of new content, some of which is similar to the old pages, and I don't want to get hit with a duplicate content penalty. And since many of the pages are dynamically generated, I won't be able to set up a simple redirect from each old page to its new one.

Basically, I'd like to know the best way to go about this. Any help would be greatly appreciated. If I've caused any confusion, let me know and I'll gladly elaborate.

Thanks,
Ron
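
P.S. For reference, here is the kind of robots.txt rule I had in mind for the old section. This is just a rough sketch, assuming all of the old URLs sit under the /reviewer/ folder like the example URL above:

    # Keep Google from crawling the old, soon-to-be-removed URLs
    User-agent: Googlebot
    Disallow: /reviewer/

As I understand it, this only stops future crawling; the already-indexed URLs may still linger in the index for a while, which is part of what I'm asking about.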