Googlebot visits a site to collect data, and if the content is fit enough to get processed through Google Search's algorithms, it indexes the pages. Now let us discuss the various ways we can get Googlebot to visit our sites. Give in your views!
For starters: 1. Make a sitemap and submit it. 2. Get good backlinks. Later, try different SEO techniques.
You have to have a robots.txt file that allows all bots to visit your website. You also have to build a sitemap and submit it to Google.
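As an illustration, a permissive robots.txt that allows every crawler and points to the sitemap could look like this (the domain is just a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` line means nothing is blocked, so all well-behaved bots may crawl the whole site.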
Submit a sitemap; for Googlebot it is like a road map that guides the crawler through the site. In the beginning, do directory submission and social bookmarking for targeted keywords. That not only helps with indexation but also generates backlinks that boost your site's ranking.
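If you want to see what goes into such a sitemap, here is a minimal sketch that builds one with Python's standard library; the page URLs are hypothetical examples, and a real sitemap would usually also carry `<lastmod>` entries per URL:

```python
# Minimal sitemap.xml generator using only the standard library.
# The URLs below are placeholder examples, not a real site.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    # Register the sitemaps.org namespace as the default so the output
    # uses plain <urlset>/<url>/<loc> tags instead of prefixed ones.
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for loc in urls:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = loc
    return ET.tostring(urlset, encoding="unicode")


xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about",
])
print(xml)
```

The resulting file is what you upload to your server and then submit through Google Webmaster Tools.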
I would advise you to read this blog post about increasing your crawl rate: http://www.searchenginejournal.com/10-ways-to-increase-your-site-crawl-rate/7159/
Go to Google Webmaster Tools and use "Fetch as Google". It is the best and quickest way to get indexed in Google's search engine.
For this you can do two things: first, submit your site to Google; second, post your site's URL on your Google+ account. Both work very nicely.
So at the end of the day, what you all suggest is: submit the site to Google and social bookmark it. Thanks for your comments. More expert comments are welcome, if any!