I was wondering if it's possible to make Google completely ignore certain pages on your site and not crawl them. I'm pretty sure there's a way, but I guess I forgot because I've never done it.
It can be done by disallowing Googlebot. Add a Disallow rule under Googlebot's user agent in the robots.txt file at the root of your website.
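A minimal sketch of what that robots.txt might look like (the /private/ directory and page name here are just placeholders for whatever you actually want to block):

```
# Rules for Google's crawler only
User-agent: Googlebot
Disallow: /private/
Disallow: /some-page.html

# Rules for all other crawlers (allow everything)
User-agent: *
Disallow:
```

Note that Disallow only stops well-behaved crawlers from fetching the page; the URL can still show up in search results if other sites link to it, since Google may index the URL itself without crawling the content.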
Good advice above - it's all about the robots.txt file. Also, if you wanted the page to be indexed but no PR passed to it, you could add "nofollow" to any internal links pointing to the page.
Agree with the advice above. You can also use .htaccess rules to block or password-protect the pages you don't want indexed. A "nofollow" tag on your own links will not prevent Google from indexing the page, as there may be dofollow links to that page from other sites.
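One .htaccess approach (a sketch, assuming an Apache server with mod_headers enabled, and a hypothetical file name) is to send an X-Robots-Tag header, which tells search engines not to index matching files even if they are crawled:

```
# Send a noindex header for this specific file
# (requires mod_headers; "secret-page.html" is a placeholder)
<Files "secret-page.html">
    Header set X-Robots-Tag "noindex, nofollow"
</Files>
```

This works well for non-HTML files like PDFs, where you can't add a meta robots tag to the page itself.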
To block a single file being crawled, add the following line to robots.txt: Disallow: /subdir/your_file.html
You can also control it at the page level by putting a meta robots tag in your page's head section. Here's an example that tells search engines not to index the page or follow any links on it: <meta name="robots" content="noindex,nofollow">. Keep in mind this only works if the page is not blocked in robots.txt - Google has to be able to crawl the page to see the tag.