All my forum permissions are set so that you have to register to view them (guests who are not logged in cannot view anything). Is it true that Google bots are treated as guests and therefore will never index the new content in new posts, because they are automatically locked out of them?
Yes, content that requires a user login will not be crawled, indexed, or cached. You have some options around this; see the link below. The last paragraph is most relevant to your issue. http://www.google.com/support/webma...536&query=crawl+restricted+pages&topic=&type=
Thanks Robert. Can anyone tell me how to do this? I'm on a shared server and I cannot find where I might accomplish this in the admin CP. Can I change my robots.txt file to do this?
Implementing this requires some care. It is simple enough to use javascript/php to check the user agent and IP address, identify googlebot (etc.), and skip the redirect to the login page for the bot; see the rough sketch below. That said, serving up different content based on user agent can create problems that resemble cloaking, which can get you removed from the index. So if you don't find someone here who has experience with vBulletin, you might ask at the vBulletin forums.
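Here is a minimal sketch of that idea in plain PHP. It is not vBulletin's actual code; the function name, the $user_is_logged_in flag, and the /login.php path are just placeholders you would replace with whatever your forum software uses. The important part is verifying the bot with a reverse-then-forward DNS lookup, so someone can't get past the login wall just by faking the Googlebot user agent.

<?php
// Sketch only: decide whether the current visitor is a verified Googlebot
// before forcing the usual login redirect.

function is_verified_googlebot() {
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    if (stripos($ua, 'Googlebot') === false) {
        return false; // user agent does not even claim to be Googlebot
    }

    // Verify the claim: reverse DNS on the IP, then a forward lookup,
    // so a spoofed user agent string is not enough to get in.
    $ip   = $_SERVER['REMOTE_ADDR'];
    $host = gethostbyaddr($ip);
    if (!preg_match('/\.(googlebot|google)\.com$/i', $host)) {
        return false; // reverse DNS is not a Google hostname
    }
    return gethostbyname($host) === $ip; // forward lookup must match the IP
}

// $user_is_logged_in is a placeholder for your forum's own session check.
if (!$user_is_logged_in && !is_verified_googlebot()) {
    header('Location: /login.php'); // everyone else still hits the login page
    exit;
}
?>

Keep in mind that whatever the bot is allowed to see this way can end up in Google's cache, so anyone clicking the cached copy may read it without registering.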
Google bots don't know how to register and don't have their own logins. Of course, there would be no privacy at all if login-protected sites and their contents could all be searched on search engines.