I have a website, but the Google spider doesn't crawl it. What should I do to get my site crawled? Please give me some tips for getting my website crawled.
I have also set up robots.txt and Analytics, and all the content on my site is written fresh by a content writer.
After submitting your website, ping the major bookmarking sites like Delicious, Digg, Reddit, and pingomatic.com. This might help with crawling.
I have also done all of this: high-PR directory submissions, social bookmarking, and article submissions, but I don't get any results. What else can I do?
That's a good start, but you also need to update the content on your website, or at least add new content, every week.
Google made a recent update for content that checks a website for duplicate content, both internally and against other sites. If your website or blog has duplicate content, there is a chance it will be de-indexed or penalized on google.com. You should check your website and fix all HTML errors (e.g. duplicate titles, meta descriptions, meta keywords) as well as any Googlebot fetch errors. This process will help you avoid de-indexing problems.
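One quick way to spot internal duplication is to compare the `<title>` of every page on the site. A minimal sketch in Python using only the standard library (the page URLs and HTML here are made-up examples, not from any real site):

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages):
    """Map each title to the pages that use it; keep only titles used more than once."""
    seen = {}
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        seen.setdefault(parser.title.strip(), []).append(url)
    return {title: urls for title, urls in seen.items() if len(urls) > 1}

# Hypothetical pages: two share a title, which is the kind of
# internal duplication described above.
pages = {
    "/index.html": "<html><head><title>Acme Widgets</title></head></html>",
    "/about.html": "<html><head><title>Acme Widgets</title></head></html>",
    "/blog.html":  "<html><head><title>Acme Blog</title></head></html>",
}
print(find_duplicate_titles(pages))
# → {'Acme Widgets': ['/index.html', '/about.html']}
```

The same idea extends to meta descriptions and meta keywords: collect the value per page and flag any value that appears on more than one URL.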
First check that robots.txt is correct, then create an XML sitemap and submit it in Google Webmaster Tools, and after that apply all the off-page optimization techniques.
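For reference, a minimal robots.txt that allows all crawlers and points them at the sitemap might look like this (example.com is a placeholder domain):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

And a bare-bones XML sitemap in the sitemaps.org format, listing one URL per `<url>` entry:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

An empty `Disallow:` line means nothing is blocked; a common mistake is writing `Disallow: /`, which blocks the whole site and would explain Googlebot never crawling it.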