I want all pages of my website to be indexed. Which robots.txt should be used?
1. User-agent: * Disallow:
or
2. User-agent: * Allow: /
I am currently using No. 1. Is that right?
The robots.txt file is used to stop spiders from accessing your links. If you want to prevent some links from being indexed, add them to the robots.txt file with a Disallow rule.
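For example, a robots.txt that keeps crawlers out of a couple of sections might look like this (the /private/ and /tmp/ paths are just placeholders for whatever you actually want blocked):

    User-agent: *
    # block everything under these paths (example paths only)
    Disallow: /private/
    Disallow: /tmp/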
Go to Google Webmaster Tools and submit your site there. Crawling can take a couple of hours or days, but it shouldn't take longer than that. It will also tell you what you're missing to get crawled better.
Use No. 1. Technically you don't even need a robots.txt file, but it is best practice to have one; it is really meant for blocking URLs from search engines. If you are having problems getting all of your site indexed, make sure you have an XML sitemap and submit it to Google Webmaster Tools. It may give you hints as to why some pages are not getting indexed. Regularly creating fresh content and getting quality links can also help get your site indexed quicker.
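If you don't have a sitemap yet, here is a rough sketch of what a minimal sitemap.xml could look like (example.com and the listed URLs are placeholders; list your own pages instead):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry per page you want indexed -->
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/about.html</loc>
      </url>
    </urlset>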
It is better not to play with the robots.txt file; a mistake there could prove very costly. Submit a full sitemap in Google Webmaster Tools and update your site regularly. Build backlinks and improve your social media presence. Then there will be no reason for any page of your site not to be indexed in Google.
I agree with traxport. Add your new site to Webmaster Tools, then go to Optimization > Sitemaps and add the link to your website's sitemap. That notifies Google about it and about the pages you have already published.
First use option No. 2 in robots.txt so Google can crawl everything, then submit your website in Webmaster Tools.
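For reference, a robots.txt using option No. 2 could look like the sketch below; the Sitemap line is optional and assumes your sitemap lives at http://www.example.com/sitemap.xml (domain and path are placeholders):

    User-agent: *
    # allow crawling of the whole site
    Allow: /

    # optional: point crawlers at your sitemap (placeholder URL)
    Sitemap: http://www.example.com/sitemap.xml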