Hello. My website, as many of you know, is Mixupdate.com (signature URL). I changed the default permalinks to a custom structure a while back. 301 redirects are already in place for the old URLs, but even today the old links are still being indexed by Google. Maybe changing permalinks requires some other change to the WordPress settings? I just want to confirm that everything is fine at my end before I blame Google for this.

If it really is Google that should be blamed, then I have thought of another way of speeding up the process of getting the links removed. Here is what I have tried so far:

--> 301 redirected the old links to the new ones (yet Google considers the new and old as duplicates).
--> Manually added the old links' parameters in Webmaster Tools' URL Parameters so they get ignored (yet the links are still in Google's index).

Even after all this, Google keeps increasing my duplicate-page count whenever it crawls, although the pages are 301 redirected. I can manually remove those pages from Google's index, but the removal requirements are:

-> 404 error (not possible due to the 301 redirection).
-> Noindex tag (not possible since the pages are redirected to the new links).
-> Robots.txt (probably the one thing that is possible).

So what do you guys say about robots.txt? Should I add all the links to robots.txt so they cannot be crawled by Google? Also, I usually like to keep my robots.txt clean, so once I have added the links to robots.txt and had them removed through URL Removal in Google, can I clean up my robots.txt again, since the content is already 301 redirected? What do you guys think of this idea? And what directive would disallow specific parameters, like Mixupdate.com?p=1234, through robots.txt? Also, if there's another way, please share!
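For reference, the 301 redirects I set up are along these lines (just a sketch; this assumes Apache with mod_alias, and the paths shown are made-up examples, not my real permalinks):

```apache
# .htaccess sketch (assumes Apache mod_alias is available)
# Example old path redirected permanently to the example new permalink
Redirect 301 /old-permalink-path/ http://Mixupdate.com/new-post-slug/
```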
So, anyone who thinks this is a good step to achieve what I want, and who also knows the directives to put into the robots.txt, please advise.
Use the tag <meta name="robots" content="NOINDEX"> for pages which you don't want indexed. For all the rest, use a 301 redirect. Where you would have served a 404 code, use the HTTP 410 code (Gone) instead. You can also create a sitemap as an XML or gzip file. Grab some new backlinks to the new URLs and wait. That's all.
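For example, a 410 for a permanently gone page can be sent from Apache like this (a sketch; the path is just an example):

```apache
# .htaccess sketch (assumes Apache mod_alias); path is illustrative
# "Redirect gone" makes the server answer HTTP 410 instead of 404
Redirect gone /some-removed-page/
```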
So what should I be doing? I assume putting the old 301-redirected links into robots.txt (noindex) to get them removed, right?
The "NOINDEX" meta tag needs to go in the HTML code, not in robots.txt. The 301 and 410 codes need to be generated by PHP or mod_rewrite. A sitemap can be generated by many free online tools. The whole reindexing process is a bit long and can take 2-6 weeks, depending on the website. Leave the robots.txt as it is now; don't do anything with it.
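As an example of doing it with mod_rewrite, a 301 from an old default permalink like ?p=1234 to a new custom permalink could look like this (a sketch; the post ID and target slug are made-up examples, and WordPress's built-in canonical redirect usually handles ?p= URLs on its own):

```apache
# .htaccess sketch (assumes Apache with mod_rewrite enabled)
RewriteEngine On
# Match requests like /?p=1234 (old default permalink)
RewriteCond %{QUERY_STRING} ^p=1234$
# Permanent (301) redirect to the new permalink;
# the trailing ? drops the old query string from the target
RewriteRule ^$ /new-post-slug/? [R=301,L]
```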
If you have everything right, meaning all the codes and redirects to the new URLs, then don't worry mate, everything will be fine. Just remember that after reindexing you need to leave all the redirects and codes in place for a longer time, a minimum of 3-6 months. New backlinks to the new URLs will make this process quicker.
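A minimal XML sitemap listing the new URLs follows the sitemaps.org format; something like this (the URL and date are example values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per new permalink; values are examples -->
    <loc>http://Mixupdate.com/new-post-slug/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
</urlset>
```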
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search

Sitemap: www.stylechanel.com/feeds/posts/default?orderby=updated