Hi, I want to use robots.txt to block a specific paragraph of a page so the Google crawler cannot crawl it. Is this possible? Please suggest how I can do it.
No, it is not possible. Robots.txt works at the URL level, so there is no way to block only a certain part of a web page through it. What you can do instead is load the content you do not want indexed with JS, as in the sketch below.
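A minimal sketch of that JS approach, assuming the hidden text is served from a separate URL (the /private-snippet.txt endpoint and the private-paragraph element ID are made-up names for this example). The paragraph is left empty in the HTML and only filled in client-side:

```javascript
// Sketch: keep the sensitive paragraph out of the HTML and inject it with JS.
// "/private-snippet.txt" is a hypothetical endpoint serving just that text.
document.addEventListener('DOMContentLoaded', function () {
  fetch('/private-snippet.txt')                 // load the hidden text separately
    .then(function (response) { return response.text(); })
    .then(function (text) {
      // "private-paragraph" is a placeholder <p> already present on the page
      var p = document.getElementById('private-paragraph');
      if (p) {
        p.textContent = text;                   // fill it in client-side only
      }
    });
});
```

If you also add `Disallow: /private-snippet.txt` to your robots.txt, the crawler is not allowed to fetch that text at all, so the rest of the page can still be crawled and indexed normally.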