I want Google to index my site ASAP, so I added a robots.txt to my site like this: User-agent: * Allow: / Is this right?
I think this directive works the same way: User-agent: * Disallow: And by the way, there is no way to force search engines to index your pages. You have to wait for the crawler to come around; a robots.txt file and a sitemap give you no guarantee that your site will be indexed. =)
Allowing everything is the same as not having a robots.txt at all. If that's what you want, just delete the file.
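To make the equivalence above concrete, here are the variants discussed in this thread side by side. The /private/ folder in the last example is just a hypothetical name to show the blocking syntax:

```text
# Variant 1: explicitly allow all crawlers (the original poster's version)
User-agent: *
Allow: /

# Variant 2: empty Disallow — equivalent to allowing everything
User-agent: *
Disallow:

# Variant 3: allow everything EXCEPT one folder
# (/private/ is a made-up example path)
User-agent: *
Disallow: /private/
```

Note that Disallow with no value means "disallow nothing", which is why Variant 2 behaves the same as Variant 1, and both behave the same as having no robots.txt file at all.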
Adding a robots.txt file will not force or speed up the indexing process. But it may be beneficial when the spiders DO start arriving - particularly if you have folders or pages that you do NOT want indexed. If you want to put out some "spider food", just go over to Squidoo.com and drop a couple of topic-related articles with backlinks to your page(s). Wishing you all the best, Dan B. Cauthron