Hi everyone, I need some help creating a robots.txt for my website. I want to disallow some URL patterns, but I don't know how to write that syntax in robots.txt. Example: the URLs on my website look like: http://www.example.com/portfolio/portfolio1/ http://www.example.com/portfolio/portfolio2/ Now, I want to disallow all URLs after http://www.example.com/portfolio/. I mean I want to disallow the portfolio1, portfolio2, ... pages. Is there any syntax for it? Thank you in advance.
Your robots.txt would look like this:

User-agent: *
Disallow: /portfolio/portfolio1/
Disallow: /portfolio/portfolio2/
Hi, you can use my robots.txt for reference: bestpedia[dot]in/robots[dot]txt I've blocked multiple directories from being crawled. You can ignore the Attracta lines because they're no longer in use.
Thank you for the answer. But I have many URLs like portfolio1, portfolio2, ..., portfolio500, so I am looking for a pattern that disallows all of them with one line.
Thanks for the reply. I got an idea from your reply and created a rule like: Disallow: /portfolio/portfolio And it works for me, because I want the portfolio page indexed but not its sub-pages like portfolio1, portfolio2, ... Thank you.
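For anyone who wants to double-check this kind of rule before deploying it, here is a quick sketch using Python's standard urllib.robotparser (the example.com URLs are just the ones from this thread; standard robots.txt prefix matching is assumed):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content using the single prefix rule from the thread.
rules = """\
User-agent: *
Disallow: /portfolio/portfolio
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /portfolio/ itself stays crawlable, because its path does not
# start with the disallowed prefix /portfolio/portfolio ...
print(parser.can_fetch("*", "http://www.example.com/portfolio/"))             # True
# ... while every sub-page matching that prefix is blocked.
print(parser.can_fetch("*", "http://www.example.com/portfolio/portfolio1/"))  # False
print(parser.can_fetch("*", "http://www.example.com/portfolio/portfolio500/")) # False
```

Note that Disallow values are plain path prefixes, so one line covers portfolio1 through portfolio500 (and anything else starting with that prefix) without listing them individually.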