I found out why I did so badly on the latest PR update and why many of my sites dropped in ranking: my ISP has been blocking the bots from accessing my sites! I've been with pair.com for seven months and have been working my butt off trying to figure out why pages are falling off the engines. This in turn causes a drop in traffic, earnings and other things such as DP weight, etc. My Google Sitemaps have been reporting errors, which is what clued me into the problem; it's the only way to tell that they are blocking by IP address, and it's Google's way of telling you why you are not being indexed. I contacted pair and they said that they have been blocking bots because of the load the bots put on the server! They still had Google's IP address blocked in their rules when I contacted them, and they only release it if you ask. Because the block is handled by a separate group, they have no way of notifying you; you have to ask! So if you're with pair, on a dedicated or virtual server, your rankings may have nothing to do with your efforts, but rather with the rules put in place by your ISP.
Only quality, relevant backlinks are needed, I think. No other factors influence PR besides quality backlinks.
For fuck's sake, that's appalling behaviour by an ISP. I hope your sites regain their original positions. Personally, I'd take my money elsewhere on principle; they're obviously more interested in cutting costs where they can rather than thinking about their customers' needs. Who in their right mind would consider stopping the Googlebot from spidering their customers' websites? Honestly. Pete
Yeah, but I think the point he is making is that a page might not even have to exist for it to have PageRank. So, spidered or not, pages should have PR in Google's internals. I'm not commenting either way, because I'm not sure, but I thought I'd point that out.
That's some serious BS! They're a big host too; I wonder how many people are getting screwed over because of it. By gosh, if you've got a VPS or dedicated server, you should be able to do whatever the hell you want with it.
I think that was meant for me. Pair.com has a very good reputation; if they have done this, it is really very bad for such a high-profile company.
I'm assuming that PR would be impacted by blocking the Googlebot, but I am 100% positive that your rankings and number of pages indexed take a major hit. I also use the co-op and noticed my weight dropping, which I'm guessing has something to do with the number of pages indexed (or not indexed).

I went with Pair because I heard they had a great reputation. A lot of big players use their service, and I was totally blown away when I found this out. The whole point of going with Pair.com was to improve my rankings, not kill them! I've been in the process of moving my sites off of Pair. It is very frustrating because I spent so much time setting them up: databases, security and more.

I did really badly on this PR update (I know, PR is not all that, but I still get excited) and couldn't help thinking about the crap that's been going on. Regardless, I was shocked that an ISP would block IPs like this! I'm venting, but I thought I would let others know what goes on behind the scenes and bring this kind of behavior to light.
That's disgusting. If you've lost earnings over it, I would contact a lawyer. If their terms and conditions state *explicitly* that bots will be blocked, there would be no legal ramifications; if not, you could sue them on the basis that it is *probably* an implicit term of contract that an ISP does not block bots (based on the industry standard of not blocking them). Comb through http://pair.com/policies/ to see if they explicitly state that they block bots. If not, and you think it's worth it (depends on your earnings, I guess!), contact a lawyer. Let us know what you decide to do and how you get on. ~Jamie

Found this: http://www.calendarscript.com/support/forum/Forum2/HTML/000916.html Not sure if it's relevant, but it seems there are other cases of this sort of behaviour.
I wonder if it affects Yahoo too? I have one site (out of about a dozen) at Pair, and it is the ONLY one banned at Yahoo (for two years it has shown the index page only). I always thought they were expensive for what you get... maybe it's time to move.
FYI - one way to detect whether your ISP is blocking the Googlebot is to set up a Google Sitemap. If the bots can't access your site, you'll see errors reported there.
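If you'd rather check your raw access logs directly instead of waiting on the Sitemaps report, something like the following works. This is just a rough sketch in Python; the log file name, the assumed Apache "combined" log format, and the user-agent substrings are assumptions you would adjust for your own host.

import re
from collections import defaultdict

# Rough sketch: scan a raw access log (Apache "combined" format assumed)
# and report the last time each major crawler's user-agent appeared.
LOG_PATH = "access.log"   # hypothetical path; your host may name it differently
BOTS = ["Googlebot", "msnbot", "Slurp"]

# Combined format ends with: "referer" "user-agent"; the date sits in [brackets].
line_re = re.compile(r'\[(?P<date>[^\]]+)\].*"(?P<agent>[^"]*)"\s*$')

last_seen = defaultdict(lambda: "never")
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = line_re.search(line)
        if not match:
            continue
        for bot in BOTS:
            if bot in match.group("agent"):
                last_seen[bot] = match.group("date")   # log lines are chronological

for bot in BOTS:
    print(f"{bot}: last seen {last_seen[bot]}")

If Googlebot shows "never" (or a date weeks old) while normal visitors are still showing up in the log, that's a strong hint the crawler is being blocked somewhere in front of your site.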
Hi, this is Kevin Martin from pair Networks, Inc. pair does not by any stretch of the imagination have a policy to block search engines from customer sites. I cannot fathom any sane reason for doing that. There are instances from time to time where a spider, for whatever reason, hammers a site or server, and we block it. Those blocks are intended to be temporary and are there to protect the service. And yes, even Google's spiders don't always do a good job spacing out their crawls; on a dynamic site, Google can take down the entire server pretty easily. It's rare, but it does happen, and it's never supposed to be a long-term problem. If your specific case hasn't been resolved by support, please contact us, or me directly, so we can investigate. Again, we do not block search engines in general, and I would be surprised if any Web host does so. Thanks, Kevin
Kevin,

This has been a problem for months. Every time, you (Pair) say that you'll notify us, but every time you block the IP address of the Googlebot and then forget to remove it. It is only when I see the Sitemaps error and write you guys that the block gets removed.

You say you don't have a policy of blocking search engines, yet you block the bots if they generate too much load. It doesn't take a stretch of imagination to see that this amounts to the same thing as blocking the search engines. You have stated in the past that you have no way of placing temporary blocks on IP addresses. I understand your intention is to make them temporary, but they never are; if I have misunderstood our communication on this point, please take the time to explain exactly how you place a temporary block on an address.

If I have a dynamic site at Pair, does that mean the server could crash under the load of the Googlebot? In the past, your emails stated that the Googlebot was hammering the site and you had to block it to prevent a crash. I also understand you are willing to work on this, but that has been the same response over and over. My goal here is to bring my situation to light so that others can check for similar problems.
TooHappy, I have to chime in here and say I ran into a very similar experience with Pair. The Googlebot was blocked for over a month without my knowledge, and I know this hurt my sites quite a bit. I finally noticed my sitemaps in Google were showing timeouts when it tried to download them. Then I started to notice in AWStats that the Googlebot had not visited for a long time. I do feel this issue has been resolved with the tech staff, but it was quite unsettling to find out.
Just to let everyone know, Fasthosts (who claim to be the no. 1 hosting company in the UK) also do this. Their 'customer support' have claimed via email that they do not block any spiders, but looking through my raw log files, I can see that when Googlebot, msnbot or Yahoo! Slurp come to visit, they get a 403 Forbidden error. The site works correctly for ordinary users and validates as strict XHTML: http://www.mini-organic.co.uk

Amongst the comments from 'customer support':

"if the site works correctly for users there is no problem" (missing the point entirely: how will users find the site if the search engines can't index it?)

"The 403 errors are generated if there is no default document in a given folder. If you have disabled friendly http error messages in IE this will show as 'directory listing denied'." (again, missing the point: first, the file showing a 403 error to the spiders is index.asp, not a folder; second, it is not a user browsing the site, but a search engine spider; third, I'm pretty sure the search engine spiders do not browse with IE)

"We do not block spiders from visiting our sites - most people want their sites to be picked up by spiders and we even sell a Traffic Driver service to help promote sites in search enginers so this something we would not be blocking." (maybe they are just trying to force their customers to use this 'Traffic Driver': it's expensive, and it claims to submit sites to '400 search engines a month', which doesn't sound like good SEO practice to me, as the major search engines tell you to submit once, and multiple submissions can get you blacklisted)

I tried some testing of my own using the tool at http://www.smart-it-consulting.com/internet/google/googlebot-spoofer/index.htm , which shows what each search engine bot 'sees' for a site. It returns a 403 error for each bot that I tested. I then uploaded the same site to a subdirectory of a site hosted elsewhere, ran the Googlebot spoofer again, and found that each of the bots could now access the site correctly.

I don't really want to move the hosting, as I have had problems finding a host with good uptime at a reasonable cost, but if I can't get this resolved, I may have to. Any suggestions? JennyJ
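For what it's worth, you can run roughly the same test as that spoofer tool yourself. Below is a rough Python sketch; the URL and the user-agent strings are placeholders/assumptions, and note that this only reveals user-agent based blocks. An IP-level block like the one described at pair would not show up from your own connection.

import urllib.request
import urllib.error

# Request the same page with a few crawler user-agent strings and compare
# the HTTP status codes with what a normal browser gets.
URL = "http://www.example.com/"   # put your own site here

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB) Firefox/1.5",
    "Googlebot": "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "msnbot": "msnbot/1.0 (+http://search.msn.com/msnbot.htm)",
    "Yahoo! Slurp": "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)",
}

for name, agent in USER_AGENTS.items():
    request = urllib.request.Request(URL, headers={"User-Agent": agent})
    try:
        with urllib.request.urlopen(request) as response:
            status = response.getcode()
    except urllib.error.HTTPError as err:
        status = err.code          # e.g. 403 Forbidden, as seen in the raw logs
    print(f"{name}: HTTP {status}")

If the browser line comes back 200 while the bot lines come back 403, the host is filtering on the user-agent, which matches what the raw log files are showing.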
I was considering using Fasthosts for some UK sites, but they can stick it up their arse now. I can't stand the thought of HOSTING companies who pride themselves on being the best doing such deceitful things as this to save themselves a few quid at the expense of their customers' websites. Pete
Pete, it's a very serious matter if a host is blocking a search engine bot; as we all know, we need those bots. I have a dedicated server with Fasthosts, and if for one second I feel they are blocking the bots, I am removing my account ASAP.