The only times I've seen 403s on known-working sites are when:
1) robots.txt disallows the bot from accessing key content,
2) an .htaccess rule is denying access,
3) country-blocking software is filtering traffic by location, or
4) some kind of leech-hammer script is banning "bad" IPs at the router/firewall (treating bots as a drain on resources).
I'm sure there could be other explanations, but those are the ones I've run into previously.
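For case 1, you can check whether a site's robots.txt would block a given bot with Python's stdlib parser. This is just a sketch: the robots.txt content and bot names below are made up, and note that robots.txt is advisory on its own; a 403 only appears when the server separately enforces it against the bot's user agent.

```python
# Check which paths a robots.txt allows for a given user agent.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that bans one bot entirely and
# keeps everyone else out of /private/.
robots_txt = """\
User-agent: ExampleBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("ExampleBot", "https://example.com/page"))    # False
print(rp.can_fetch("OtherBot", "https://example.com/page"))      # True
print(rp.can_fetch("OtherBot", "https://example.com/private/x")) # False
```

If the parser says the bot is disallowed, that's the first thing to rule out before digging into .htaccess rules or firewall bans.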