This is the story. This is a new site, and I have a Java script that I use to enter the articles I write. After about a month of using it, I changed the way I name pages to make them more SEO-friendly: I am now naming the URLs manually using the title of the article. Before that, the script was just spitting out numbers for the pages. So hopefully that will help in the long term.

Here is the problem. I already had 85 pages indexed by Google under the numbered URLs. I renamed all of those using the article titles, and now Google is reporting 85 pages not showing up for my website. They are penalizing me for this. In my crawl stats I have lost all of my medium rating, and a portion has been put back into "not yet assigned", which I had the site completely out of until this happened.

Reading Google's documentation, I found out I could put a robots.txt file on my server to tell them to disregard those old pages. Is this correct? Is there a better way to do this? Since I have no idea how to do this, is there a tutorial someplace I could follow? (I took a rough guess at what the file might look like in the P.S. below.) Any other comments I have not thought of would be greatly appreciated.

I have worked really hard on this site and thought I would get a PR4, or at worst a PR3, with the first update. But now it is looking like a PR2. Damn, I would have loved to get the PR4. Is there any way I can still save it? Any help with this and I would be grateful for life. Well, not really for life, but I would appreciate it!

Thanks,
Bill
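P.S. From what I have read, the robots.txt goes in the root of the site, and I'm guessing it would look something like the sketch below. The /123.html pattern is just my guess at how my old numbered URLs were written, so the exact paths would have to match whatever my script actually spat out:

    # robots.txt, placed in the site root
    User-agent: *
    # One Disallow line for each old numbered page I renamed
    # (paths below are made up -- I'd list my real old URLs here)
    Disallow: /123.html
    Disallow: /124.html
    Disallow: /125.html

Does that look roughly right, or am I way off?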