I found this robots.txt here: http://www.askapache.com/seo/seo-with-robotstxt.html. What I want to ask: if I use this robots.txt, will the search engines index every page I write? I want every page to be indexed, for example: myblog.com/music
A robots.txt file just tells search engine bots which pages are meant to be indexed and which are not. Using a robots file helps in excluding the pages which you do not want to get indexed.
Yes, but I want to know if the example robots.txt will let search engines index my new pages (new posts), not only the domain. I want to know whether each post will be indexed as a separate page by search engines.
Do you have a different page for each post (most blogs do)? The number of indexed pages does not depend much on the robots file. Try getting good backlinks and interlinking the pages on your blog in a better manner.
Yes, but I was wondering whether this robots.txt http://www.askapache.com/seo/seo-with-robotstxt.html would avoid blocking the pages created for each post.
The robots.txt file you linked to, http://www.askapache.com/robots.txt, is GREAT; just do a search for site:www.askapache.com on Google and you will see how effective it is. The problem with what you are asking, though, is that you have it backwards. robots.txt doesn't tell anyone to index pages; it tells everyone what pages NOT to index. A sitemap sounds like the solution for you.
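To make that concrete, here is a minimal sketch of how robots.txt exclusion works (the /private/ folder is just a hypothetical stand-in for whatever you want hidden); anything not matched by a Disallow rule stays crawlable by default:

# Applies to all crawlers
User-agent: *
# Blocks crawling of this one folder; everything else remains open
Disallow: /private/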
Just using a simple robots.txt file is better than using a complicated structure. Here is mine; robots are still able to crawl the new posts that I make. By the way, if you want to check whether your robots.txt works the right way, you can test it in Google Webmaster Tools. -cypher.
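For reference, a simple permissive robots.txt along those lines, one that blocks nothing at all, is just this (an empty Disallow line means no URL is excluded):

User-agent: *
Disallow:

Save it as robots.txt at the root of the domain, e.g. myblog.com/robots.txt.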
As explained here:
- A robots.txt file is used to stop certain/all search engines from seeing certain pages and/or folders.
- The meta tag noindex is used to allow search engines to crawl pages, but they are told not to index them.
- A sitemap is used to tell search engines what pages you would like indexed and when they were updated.
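For example, the noindex meta tag goes in the <head> of the page you want kept out of the results; "follow" just tells the bot it may still follow the links on that page:

<!-- page can be crawled, but should not appear in search results -->
<meta name="robots" content="noindex, follow">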
I deleted robots.txt. I'm so confused. I want every post to be indexed for the keyword I target; this is what I want. If I write a page about best hosting, I want www.myblog.com/best-hosting/ to rank and be indexed for "best hosting".
There are many things that Google uses to rank your page in the Google SERPs, not only the robots.txt. -cypher.
newzone, the only sure way to get Google to index content is for Google to find a LINK to that content. So if Google already has your homepage indexed, a link on your homepage to /best-hosting/ will ensure that Google finds it.
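Concretely, that just means a plain crawlable anchor somewhere in your homepage HTML (the URL and anchor text here come from your own example):

<!-- an ordinary link; crawlers discover /best-hosting/ by following it -->
<a href="/best-hosting/">Best Hosting</a>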
The only thing I would suggest adding to that robots.txt is the sitemap location. For example:

User-agent: *
Disallow: /includes/
Disallow: /images/
Sitemap: http://www.yourdomain.com/yoursitemap.xml

And then if you really want to go the extra mile, which it seems you do, use the old-fashioned Python sitemap generator found on the Google Webmaster site. Run it as a cron job or some other scheduled job as often as you feel the need, and for the directory your posts go in, set it to "daily" using the option where it crawls your site looking for files. (In a sitemap you can have daily, weekly, or monthly set per page.) The reason I suggest the generator from Google's site is that after you run it, it notifies Google that a new sitemap has been generated. You will surely give yourself the best chances this way.
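For what the generated file itself looks like, here is a minimal sitemap.xml in the standard sitemaps.org format (the URL, date, and "daily" frequency are placeholders matching the examples above):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per post page -->
    <loc>http://www.yourdomain.com/best-hosting/</loc>
    <lastmod>2011-01-15</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>

And a cron line for Google's generator might look something like this (assuming the script and its config.xml live in /var/www; the 3 a.m. schedule is arbitrary):

0 3 * * * cd /var/www && python sitemap_gen.py --config=config.xml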