I know how to use forums when link building, but I also know some forums block bots. I've heard there's a program out there that lets you get an overview of a forum and see whether it disallows spidering.... Anyone know where I can find the info I need? Or can you set me straight if my understanding is wrong? Thanks!
In a web browser, type in http://www.yourforumdomain.com/robots.txt and see whether any directories are set to Disallow. You can also view the forum's page source to check whether the links carry rel="nofollow" or not. Another way to check is the site:www.domain.com search command, which you can use in Google or Yahoo to see all of the site's indexed pages; pages generally only get indexed when nothing is blocking the crawler. You can also submit a sitemap to Google Webmaster Tools to see whether any other problems are affecting the crawlability/indexability of your website.
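If you'd rather check a robots.txt programmatically than eyeball it in a browser, Python's standard library can parse the rules for you. A minimal sketch, assuming a hypothetical forum domain and made-up Disallow rules just for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content you might see on a forum
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() tells you whether a given crawler is allowed to spider a URL
blocked = parser.can_fetch("*", "http://www.yourforumdomain.com/admin/index.php")
allowed = parser.can_fetch("*", "http://www.yourforumdomain.com/viewtopic.php?t=1")

print(blocked)  # False -- /admin/ is disallowed
print(allowed)  # True  -- topic pages are crawlable
```

In practice you'd fetch the live file with `parser.set_url("http://www.yourforumdomain.com/robots.txt")` followed by `parser.read()` instead of pasting the text in. Keep in mind this only covers robots.txt; nofollow attributes on the links themselves still have to be checked in the page source.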