Hi, I have a folder /FR/ with 2,607 useless duplicated URLs I need to block with robots.txt, and 130 that I will redirect with a 301. Should I enter the 2,607 URLs line by line, or is there a way to write in robots.txt: block the whole /FR/ folder except these 130 URLs? Thanks!
Yes, you can Disallow the folder, so all pages in that folder will be disallowed. Here is the syntax:

User-agent: *
Disallow: /FR/

You can cross-check it in the robots.txt section of Google Webmaster Tools.
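To keep the 130 URLs crawlable while blocking the rest of the folder, Googlebot (and most major crawlers) also honor an Allow directive that overrides a broader Disallow for specific paths. A sketch, with placeholder paths standing in for your actual URLs:

```
User-agent: *
Disallow: /FR/
# Allow overrides the Disallow for specific paths.
# Note: Allow is not part of the original robots.txt standard,
# so some minor crawlers may ignore it.
Allow: /FR/example-page-1.html
Allow: /FR/example-page-2.html
```

You would still list each of the 130 Allow lines individually unless they share a common prefix that no blocked URL shares. Also note that URLs you 301-redirect generally should not be disallowed at all: if crawlers cannot fetch them, they never see the redirect.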