In my Google Webmaster Tools, my site is showing crawl errors, and 120 URLs are listed as "Restricted by robots.txt (120)". If you know about this, please give me a suggestion as to what these errors are and how I can fix them.
There are plenty of sources online that explain what to do with a robots.txt file, but to my understanding many spiders simply ignore robots.txt and crawl everything regardless. I think your web guy probably put a Disallow rule (or a nofollow) on some major directories, in an attempt to block spiders from crawling redundant information. Maybe you should look at your website through a text-based browser such as Lynx, which is what Google suggests, since that is roughly how a spider sees your site: the textual links, with no Flash, no JavaScript, etc. If those links are messed up, then yes, you are going to have problems.
When Google's bots cannot access a URL, Webmaster Tools details the problem it had crawling that link. In your case, Google is reporting that it cannot crawl those 120 links because they are restricted by robots.txt. As the previous post said, contact your technician and have a look at your pages through Lynx; to do this with the Lynx Viewer, you will need to prove that you are the webmaster of the site so it can access it.
Check your robots.txt carefully to see whether any of its rules are restricting those URLs from being crawled. It is easy to edit, since robots.txt is just a plain text file. Only once the restricting rules are removed will those URLs get crawled and indexed.
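For reference, here is a minimal sketch of what a restrictive robots.txt looks like. The paths are made up for illustration; check your own file for the Disallow lines that match your 120 blocked URLs:

```
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Hypothetical override for Google's crawler only
User-agent: Googlebot
Allow: /
```

Removing or narrowing the Disallow lines that match your blocked URLs lifts the restriction the next time Google fetches the file.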
Google Webmaster Tools usually tells you exactly what is erroring out. Everyone here is right: CMSes like Joomla and WordPress usually ship a robots.txt with restrictive rules so the engines don't try to spider your backend and modules.
You can rewrite your robots.txt. Maybe your site has some duplicate content, which you can restrict via robots.txt. If you don't know how to write a robots.txt file, you can search for "how to write robots.txt" on Google.
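If you want to test your rules before uploading them, here is a small sketch using Python's standard `urllib.robotparser`. The rules and URLs below are assumptions for illustration; paste in your own robots.txt contents and the URLs that Webmaster Tools reports as restricted:

```python
# Sketch: check which URLs a given robots.txt blocks for Googlebot.
from urllib.robotparser import RobotFileParser

# Hypothetical rules -- replace with the contents of your own robots.txt.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /modules/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Hypothetical URLs -- replace with the ones listed in Webmaster Tools.
urls = [
    "https://example.com/products/widget",
    "https://example.com/admin/login",
]
for url in urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked by robots.txt")
```

Any URL reported as blocked here should match one of the "Restricted by robots.txt" entries in your crawl errors.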
Always put unique and informative content on your website so visitors come to your site, stay for a while, and read. If your website gets good traffic, Google will definitely index your pages.
You have to see what's on those pages. Have you copied content from someone else? Have you included proprietary coded links belonging to someone else? Think hard about what you have on your pages, and you will most likely come up with a reason why Google is rejecting them (this includes having lots of masked keywords on a page).