For example, your robots.txt file might contain:

# robots.txt generated at http://www.mydomain.com
User-agent: *
Disallow:
Disallow: /cgi-bin/

It is a plain text file located in your root directory.
Put a file named 'robots.txt' in the root of your server. It will be automatically detected by the search engines.
The best place to get the robots.txt specification is directly from Google itself. Easy to read, easy to use, a great resource. Robots.txt Specifications - https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
You can create the file in Notepad by copying the template from another website's robots.txt, editing it to your own preferences, and saving it with "robots.txt" as the file name. You can also use online tools to generate the robots.txt file. Once you have the file, upload it to your public_html directory.
Copying someone else's robots.txt only works if your sites are similar and you know what you're doing. Use the guideline above and it will help you out immensely. You're probably as well off, or better, doing a Google search than taking the risk of copying someone else's incorrectly written robots.txt.
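One way to sanity-check a robots.txt before uploading it is Python's standard-library urllib.robotparser. This is a sketch using rules like those in the example earlier in the thread (the domain is a placeholder); note that Python's parser is a simplified first-match implementation, so an unusual or complex file may be evaluated differently than by Google's crawler:

```python
from urllib.robotparser import RobotFileParser

# Rules to verify: allow everything except /cgi-bin/.
rules = """\
User-agent: *
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ordinary pages are allowed; anything under /cgi-bin/ is blocked.
print(parser.can_fetch("*", "http://www.mydomain.com/index.html"))       # True
print(parser.can_fetch("*", "http://www.mydomain.com/cgi-bin/test.pl"))  # False
```

Running a check like this against a copied-and-edited file is a quick way to catch a rule that blocks more (or less) than you intended.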