Google's John Mueller has stated that Google's crawlers rely on a cached copy of the robots.txt file, and that this cache is refreshed roughly every 24 hours. So if you're about to publish new content that should be "disallowed", add the rule to your robots.txt file at least 24 hours before the content goes live. This ensures Googlebot has picked up the updated robots.txt and will obey the "Disallow" directive before it ever sees the new URLs.
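If you want to double-check before publishing, a quick sanity check with Python's standard-library urllib.robotparser can confirm that the robots.txt your server is currently serving already blocks the new URL. This is only a rough sketch; the example.com domain and the /private/ path are placeholders.

```python
# Pre-publish check: does the robots.txt currently served by the site
# already block the URL we are about to publish?
# (example.com and /private/new-page.html are placeholder values.)
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

new_url = "https://example.com/private/new-page.html"
if rp.can_fetch("Googlebot", new_url):
    print("Not blocked yet: update robots.txt before publishing.")
else:
    print("Blocked by the live file; Googlebot may still work from a "
          "cached copy for up to ~24 hours.")
```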
I had disallowed the search pages on my site with Disallow: search/ and did the same for the admin pages, but Google still managed to index some of them....
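One thing that can trip this up is the missing leading slash: robots.txt rules are matched as prefixes of the URL path, which always begins with "/", so Disallow: search/ may never match /search/... pages at all, whereas Disallow: /search/ does. Here's a small sketch with Python's urllib.robotparser, which follows the same prefix convention (Google's own parser may be more lenient); the example.com URL is a placeholder.

```python
# Compare how a Disallow rule with and without the leading slash
# matches a /search/ URL. (example.com is a placeholder.)
from urllib.robotparser import RobotFileParser

def blocked(rule: str, url: str) -> bool:
    """Return True if the given Disallow rule blocks the URL for Googlebot."""
    rp = RobotFileParser()
    rp.parse(["User-agent: *", rule])
    return not rp.can_fetch("Googlebot", url)

url = "https://example.com/search/results"
print(blocked("Disallow: search/", url))   # False: the rule never matches
print(blocked("Disallow: /search/", url))  # True: the URL path is blocked
```

Also keep in mind that robots.txt blocks crawling rather than indexing, so a disallowed URL can still end up indexed (usually without a snippet) if other pages link to it.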
This is a really great update from John Mueller. It should motivate webmasters to be more organized and deliberate when planning website updates.