Robots.txt for parked domains

Discussion in 'Google' started by Johnny_D, Apr 28, 2008.

  1. #1
    I have a small problem.
    I have a site: mysite.com.
    I also have my-site.com parked on mysite.com. How do I restrict my-site.com from being crawled?

    Please, anyone... 10x
     
    Johnny_D, Apr 28, 2008 IP
  2. j4l4ni

    #2
    I don't have an idea for that. :( Sorry...
     
    j4l4ni, Apr 29, 2008 IP
  3. RankSurge

    #3
    in your robots.txt file:

    User-agent: Googlebot
    Disallow: /

    (use User-agent: * instead of Googlebot if you want to block all well-behaved crawlers, not just Google)

    and if you have a page on that site and you want to keep it out of the index entirely, put this in its <head>:
    <meta name="robots" content="noindex, nofollow">
    <meta name="googlebot" content="noindex, noarchive">
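    One catch with the robots.txt approach: a parked domain usually shares the same document root as the main site, so my-site.com and mysite.com would return the very same robots.txt, and a blanket Disallow would block both. A common workaround is to serve robots.txt dynamically and branch on the Host header. A minimal sketch in Python (illustrative only; the domain names are from this thread, and wiring /robots.txt to such a handler depends on your hosting setup):

    ```python
    # Sketch: return different robots.txt content depending on which
    # hostname the visitor requested, since a parked domain shares the
    # same files as the main site.

    def robots_for_host(host: str) -> str:
        """Return robots.txt content based on the requested hostname."""
        host = host.lower().split(":")[0]  # normalize case, drop any port
        if host.endswith("my-site.com"):
            # Parked domain: block all crawlers.
            return "User-agent: *\nDisallow: /\n"
        # Main domain: allow everything (empty Disallow means no restriction).
        return "User-agent: *\nDisallow:\n"

    print(robots_for_host("my-site.com"))
    ```

    The same Host-header branching can be done in whatever your server supports (a rewrite rule, a small PHP script, etc.); the point is that the decision has to happen per-hostname, not in a single static file.
    
    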
     
    RankSurge, Apr 29, 2008 IP
  4. angilina

    #4
    Use the robots.txt file to do it.

    Read this

    robotstxt.org/faq/prevent.html
     
    angilina, Apr 29, 2008 IP