When I originally set this up on a phpBB forum, the ads displayed fine but it wouldn't validate. On a hunch I deleted my .htaccess file and tried again, and it validated on the first try. I put the .htaccess file back, and a few days later I got the 'failed validation' email, so I'm guessing something in my .htaccess blocks the automated checking. Here it is (I've fixed one typo while posting: the last condition read `^Mozilla*4.7`, which should be `^Mozilla.*4\.7`):

AddType text/html .shtml
AddHandler server-parsed .html
Options -Indexes +FollowSymlinks +Includes

RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://www.addresses.com/ [OR]
RewriteCond %{HTTP_REFERER} ^http://www.iaea.org$
RewriteRule ^.* - [F]

RewriteCond %{HTTP_REFERER} ^-?$
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* - [F]

RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^amzn_assoc [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Web [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Templeton [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Openbot/3.0 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*NEWT [OR]
RewriteCond %{HTTP_USER_AGENT} ^ComputingSite [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ibm [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Pioneer [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Boris [NC,OR]
RewriteCond %{HTTP_USER_AGENT} FrontPage [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^MicrosoftPrototypeCrawler [NC,OR]
RewriteCond %{HTTP_USER_AGENT} MSIECrawler [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NetResearchServer/2.7 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [OR]
RewriteCond %{HTTP_USER_AGENT} Zeus [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NPBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InternetSeer [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EasyDL/3\.04 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Nutch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Microsoft\ URL\ Control [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [NC,OR]
RewriteCond %{HTTP_USER_AGENT} almaden [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Link [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ia_archiver [OR]
RewriteCond %{HTTP_USER_AGENT} ^ProductionBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^DIIbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^LiteBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^psbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Zao [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^TurnitinBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^trademarktracker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} HARVEST_VERSION [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Python-urllib [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EducateSearch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} DTS\ Agent [NC,OR]
RewriteCond %{HTTP_USER_AGENT} email [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mac\ Finder [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Java [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^OfflineExplorer [NC,OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^lachesis [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NaverRobot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Crawl_Application [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Sqworm [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^jetboy [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^exabot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^gazz [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*4\.7
RewriteRule ^.* - [F]
No referrer and no user agent usually spells trouble: someone up to no good. Perhaps the validator should identify itself.
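If the checker really does connect with a blank Referer and User-Agent, one workaround is to exempt its address from that particular block rather than removing the block entirely. A minimal sketch, assuming you know the checker's IP (203.0.113.10 below is a placeholder, not the real address):

```apache
# Skip the blank-referrer/blank-UA ban for one known checker IP.
# 203.0.113.10 is a placeholder -- substitute the checker's real address.
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.10$
RewriteCond %{HTTP_REFERER} ^-?$
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* - [F]
```

Because all RewriteCond lines in one group are ANDed by default, the negated REMOTE_ADDR condition means the 403 only fires for anonymous requests that are not from that address.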