The robots.txt file is used to guide search engines to the pages you want them to crawl and index. Most sites also have directories and files that search engine robots do not need to visit, so creating a robots.txt file can help your SEO. Creating one is very simple: open Notepad or Notepad++, insert the directives, and save the file as robots.txt (a plain ".txt" file). For example, my robots.txt file has the following format:

User-agent: *
Allow: /media/
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: domain/sitemap.xml

User-agent: identifies which search engine bots the rules apply to.
Allow: lets the bots crawl the listed directory or page.
Disallow: blocks the bots from the listed folder or page.
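If you prefer to generate the file with a short script instead of Notepad, a minimal Python sketch along these lines would work. The rules are just the ones from the example above, and "https://example.com" is a placeholder you would replace with your own domain:

from pathlib import Path

# The example rules from above; adjust the paths and domain to your own site.
rules = """User-agent: *
Allow: /media/
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: https://example.com/sitemap.xml
"""

# robots.txt must live in the site root (e.g. public_html/) so bots can find it
# at https://example.com/robots.txt.
Path("robots.txt").write_text(rules, encoding="utf-8")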
A very simple way to create the file is with the Yoast SEO plugin. Since it is widely considered the best SEO plugin, you probably already have it on your website. Go to Yoast --> Advanced tools --> robots.txt editor --> click "Create file". Simple.
The robots.txt file tells search bots which files should and should not be indexed. Most often it is used to specify the files that should not be indexed by search engines. To allow search bots to crawl and index the entire content of your website, you can add the following lines to your robots.txt file:

User-agent: *
Disallow:

On the other hand, if you wish to prevent your website from being indexed entirely, you can use the lines below:

User-agent: *
Disallow: /

For more advanced results you will need to understand the sections of the robots.txt file. The "User-agent:" line specifies which bots the settings apply to. You can use "*" as the value to create a rule for all search bots, or the name of the specific bot you wish to make rules for. The "Disallow:" part defines the files and folders that should not be indexed by search engines. Each folder or file must be listed on its own line. For example, the lines below tell all search bots not to index the "private" and "security" folders in your public_html folder:

User-agent: *
Disallow: /private
Disallow: /security
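If you want to check how a crawler would interpret such rules before uploading the file, Python's standard urllib.robotparser module can evaluate them. This is only a sketch: the rules are the "private"/"security" example from above, and the example.com URLs are hypothetical, used purely to illustrate which requests the rules allow or block:

from urllib.robotparser import RobotFileParser

# The example rules from above: block /private and /security for all bots.
rules = """User-agent: *
Disallow: /private
Disallow: /security
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Hypothetical URLs, only to show how the rules are applied.
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False - matches Disallow: /private
print(rp.can_fetch("*", "https://example.com/blog/post.html"))       # True - no rule matches, so crawling is allowed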