Index all pages of a website.

Discussion in 'Search Engine Optimization' started by Ashley Sally, Jan 2, 2013.

  1. #1
    I want all pages of my website indexed. Which robots.txt should be used?
    1.
    User-agent: *
    Disallow:

    or

    2.
    User-agent: *
    Allow: /


    I am currently using No. 1. Is this right?
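    Both variants permit full crawling. A quick way to check that locally is Python's standard-library robots.txt parser; this is just a sanity-check sketch, and the example URL is a placeholder:

```python
# Sanity-check both robots.txt variants with Python's standard-library
# parser; no network access is needed when feeding the text via parse().
from urllib.robotparser import RobotFileParser

def allows_everything(robots_txt, url="https://example.com/any/page"):
    """Return True if this robots.txt text lets any crawler ('*') fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("*", url)

# Option 1: an empty Disallow value means "block nothing".
option1 = "User-agent: *\nDisallow:"
# Option 2: Allow is an extension to the original robots.txt standard,
# but the major search engines honor it.
option2 = "User-agent: *\nAllow: /"

print(allows_everything(option1))  # True
print(allows_everything(option2))  # True
```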
     
    Ashley Sally, Jan 2, 2013 IP
  2. ShrinkDWorld

    ShrinkDWorld Peon

    Messages:
    66
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    These files are used to stop spiders from accessing your links. If you want to prevent some links from being indexed, add them to your robots.txt file with a Disallow rule.
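    For example, a robots.txt that keeps everything crawlable except one directory would look like this (the /private/ path is hypothetical, used only for illustration):

```
User-agent: *
Disallow: /private/
```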
     
    ShrinkDWorld, Jan 2, 2013 IP
  3. James Byun

    James Byun Active Member

    Messages:
    820
    Likes Received:
    7
    Best Answers:
    0
    Trophy Points:
    55
    #3
    Go to Google Webmaster Tools and submit your site there. Crawling can take a couple of hours or days, but shouldn't take longer than that. It will also tell you what you're missing in order to get crawled better.
     
    James Byun, Jan 2, 2013 IP
  4. SS-Q

    SS-Q Banned

    Messages:
    118
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    53
    #4
    Use #1. Technically you don't even need a robots.txt file, but it is best practice to have one; it's really meant for blocking URLs from search engines. If you are having trouble getting all of your site indexed, make sure you have an XML sitemap and submit it to Google Webmaster Tools. They may give you hints as to why some pages are not getting indexed. Regularly creating fresh content and earning quality links can also help get your site indexed more quickly.
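    A basic XML sitemap is just a list of your page URLs inside a <urlset> element. As a rough sketch (the URLs below are placeholders, not a definitive tool), one can be generated with Python's standard library:

```python
# Sketch: build a minimal XML sitemap from a list of page URLs using
# only the standard library. The URLs are placeholders for illustration.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string listing each URL in a <loc> element."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/",
                         "https://example.com/about"])
print(sitemap)
```

    Save the output as sitemap.xml at your site root and submit that URL in Webmaster Tools.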
     
    SS-Q, Jan 2, 2013 IP
  5. ntegrityit

    ntegrityit Greenhorn

    Messages:
    41
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    10
    #5
    The right directives to allow all spiders and crawlers:


    User-agent: *
    Allow: /
     
    ntegrityit, Jan 2, 2013 IP
  6. traxport121

    traxport121 Active Member

    Messages:
    1,201
    Likes Received:
    8
    Best Answers:
    1
    Trophy Points:
    63
    #6
    It is better not to play with the robots.txt file; a mistake there could prove very costly. Submit a full sitemap in Google Webmaster Central and update your site regularly. Create backlinks and improve your social media presence, and there will be no reason for any page of your site not to be indexed in Google.
     
    traxport121, Jan 2, 2013 IP
  7. seo.seophalanx

    seo.seophalanx Greenhorn

    Messages:
    32
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    16
    #7
    I agree with traxport. Add your new site to Webmaster Tools, then go to Optimization > Sitemaps and submit the link to your website's sitemap to notify Google about the pages you have already published.
     
    seo.seophalanx, Jan 2, 2013 IP
  8. dataguru

    dataguru Active Member

    Messages:
    196
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    58
    #8
    First, use option No. 2 in your robots.txt so Google can crawl everything. Then submit your website in Webmaster Tools.
     
    dataguru, Jan 2, 2013 IP