How to stop Google from indexing files from our second server

Discussion in 'Google' started by liza0728, Mar 1, 2011.

  1. #1


    Hello brilliant peeps! I've been scouring the web for the right terms for this question, but after a few hours I decided to just post it here.

    Here's the scenario: we have a website running on two servers, so the site's files are synchronized between them. The second server is for internal purposes. Let's call the first server www and the second one ww2; ww2 is automatically updated whenever files are updated on www.

    Now, Google is indexing ww2, which I want to stop; only www should be crawled and indexed. My questions are: 1. How can I get the pages already crawled on ww2 removed from Google's index? 2. How can I stop Google from indexing ww2?

    Thank you in advance.
     
    liza0728, Mar 1, 2011 IP
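    A minimal sketch of the usual approach, assuming ww2 can serve its own robots.txt (ww2.example.com is a placeholder hostname): a robots.txt that disallows all crawling on ww2 keeps well-behaved bots out, and the pages already indexed can then be requested for removal through Google Webmaster Tools' URL removal tool.

        # robots.txt served by ww2 only -- tells all well-behaved crawlers to stay out
        User-agent: *
        Disallow: /

    Note that this only works if ww2 can answer with a different robots.txt than www, which is exactly the sync problem discussed below.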
  2. ForgottenCreature

    #2
    ForgottenCreature, Mar 2, 2011 IP
  3. liza0728

    #3
    Hi! Thanks for your reply, but I'm worried because whatever I apply to robots.txt on server 2 will automatically synchronize to robots.txt on server 1, and vice versa.
     
    liza0728, Mar 2, 2011 IP
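    One way around the sync problem, sketched under the assumption that both machines run Apache with mod_rewrite and that the internal host answers as ww2.example.com (a placeholder): keep a second file, robots_ww2.txt, in the synced tree and rewrite requests for /robots.txt to it only when the request arrives on the ww2 hostname. Both servers can then carry identical files.

        # .htaccess in the web root, identical on both servers (assumes Apache + mod_rewrite)
        RewriteEngine On

        # Only when the request comes in on the internal hostname...
        RewriteCond %{HTTP_HOST} ^ww2\.example\.com$ [NC]
        # ...serve the blocking robots file instead of the public one
        RewriteRule ^robots\.txt$ /robots_ww2.txt [L]

        # robots_ww2.txt (synced everywhere, but only ever served by ww2):
        # User-agent: *
        # Disallow: /

    Since the rewrite only fires for the ww2 hostname, the synchronization can stay exactly as it is.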
  4. nirajkum

    #4
    You will have to stop the sync of robots.txt. Maybe changing the file permissions will work for you.
     
    nirajkum, Mar 2, 2011 IP
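    If the mirror is kept up to date with rsync (an assumption; the thread doesn't say what does the syncing), excluding robots.txt from the transfer lets each server keep its own copy, which may be simpler than fiddling with file permissions. The paths and hostname below are placeholders.

        # One-way sync from www to ww2, skipping robots.txt so each host keeps its own version
        rsync -av --delete --exclude='robots.txt' /var/www/site/ ww2.example.com:/var/www/site/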
  5. liza0728

    #5
    nirajkum, your response really brightened my dimming hope of solving this issue. Anyway, I'll coordinate with the person in charge there and see if it's possible with our current setup. Thanks again!
     
    liza0728, Mar 2, 2011 IP
  6. ForgottenCreature

    #6
    It shouldn't sync back, as long as updates on ww2 aren't applied to server 1.
     
    ForgottenCreature, Mar 2, 2011 IP
  7. longcall911

    #7
    Servers 1 & 2 should have the same domain name, even if it means setting up #2 as an alias. Googlebot can only follow links; it cannot browse a folder. So it really has no idea which machine it is reaching when it requests a page. It simply requests a URL. If the request is routed to one machine today and to a different machine on the same LAN tomorrow, Googlebot has no way to know. All it sees is that it requests a page and gets back a server response (the headers) and page content.

    You can use something like Lynx Viewer to see what is returned to the bot when a request is made. If both machines return the same info, you're fine.
     
    longcall911, Mar 2, 2011 IP
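    To see what is actually returned to a bot from each hostname, something like curl works as a stand-in for the Lynx viewer mentioned above (hostnames are placeholders):

        # Compare the headers each hostname hands back to a crawler
        curl -I -A "Googlebot/2.1" http://www.example.com/
        curl -I -A "Googlebot/2.1" http://ww2.example.com/

        # And check which robots rules the internal host is serving
        curl -A "Googlebot/2.1" http://ww2.example.com/robots.txt

    If ww2 answers with its own blocking robots.txt (or a redirect to www), the setup is doing its job.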