Hi there. I currently own a forum and a main site (on separate URLs). In the UK the main site does really well (for both rank and traffic), yet the forum couldn't be worse. They both have approximately the same number of links, and I have targeted different keywords for each site, but the forum hasn't even got close for its keywords (it's vBulletin). The only real problem I have seen is that I added some pages to the sitemap but then restricted them with the robots.txt file. Would this cause a problem with SEO?
The only issue is that the XML sitemap will throw errors in Google Webmaster Central; the crawlers will obey the robots.txt, assuming of course they follow that standard.
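If you want to see exactly which sitemap entries robots.txt is blocking, you can check it yourself with a quick script. This is just a minimal sketch in Python using the standard library's urllib.robotparser; the domain, paths, and robots.txt rules are made-up placeholders standing in for your real sites:

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the setup described above:
# some of the sitemap pages fall under a Disallow rule.
robots_txt = """\
User-agent: *
Disallow: /forum/private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical URLs listed in the XML sitemap.
sitemap_urls = [
    "http://www.example.com/forum/index.php",
    "http://www.example.com/forum/private/thread-123.html",
]

for url in sitemap_urls:
    if parser.can_fetch("Googlebot", url):
        print("OK       " + url)
    else:
        print("BLOCKED  " + url + "  <- drop from sitemap or unblock")

Any URL that prints as BLOCKED is what Webmaster Central is complaining about: the sitemap invites Google in and robots.txt turns it away at the door. Removing those entries from the sitemap (or unblocking them) should clear the errors.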
So having errors in Google Webmaster Central doesn't mean Google will lower your ranking for any reason; it just means Google is making you aware that the search engine can't crawl the specified pages because robots.txt doesn't allow it? I still can't figure out why the forum is doing so badly (private tutors link in my sig); it doesn't have a bad PR or number of inbound links.