It is quite simple. First, create a robots.txt file that contains the following lines:

User-agent: *
Disallow: /the-folder-or-file-you-want-to-hide-from-search-engines

Once you are done with this, upload the file to the root folder of your website.
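For example, a minimal robots.txt that keeps crawlers out of a folder and a single page might look like the sketch below (the paths /private/ and /old-page.html are just placeholders, replace them with your own):

User-agent: *
Disallow: /private/
Disallow: /old-page.html

The asterisk means the rules apply to all crawlers, and each Disallow line lists one path relative to the site root.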
Create an empty file named robots.txt, then sign up for Google Webmaster Tools; there you will have the option to choose which folders to disallow and which to allow.
You can create and configure a robots.txt file in a few simple steps. Just create a robots.txt file and upload it to your server. This link will help you configure your robots.txt file in a better way: How to Create Robot.txt File
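After uploading, a quick sanity check is to request the file from the root of your domain, for example with curl (www.example.com is a placeholder for your own domain):

curl https://www.example.com/robots.txt

If the file's contents come back at that URL, search engine crawlers will be able to find it too.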