I block visitors that increase my bounce rate and load my server without converting. These are visitors coming from Google Images and those who set my site as their homepage. Are there any negative effects I might not see from doing this? Do you block any visitors?
Don't block visitors unless they are doing something malicious on your website. I only block visitors that post spam on my site.
Thanks for the answers. Typically, users that set my website as their homepage start Internet Explorer or Firefox between 5 and 20 times a day, and most of the time they leave the site as soon as the homepage loads, which increases both the load on the server and the bounce rate. I think Google uses the bounce rate to rank websites, and I like to keep mine under 35%.
No, you shouldn't block any users. Your website is worldwide and anyone can open it; you can't go around blocking everybody.
Then you're allowing harvesters and bots to walk through your doors without any clue of who they are, why they are there, or what they are doing with the material contained on your pages.
Each webmaster must determine what is beneficial and detrimental to their own site(s) and react with their own objectives in mind. Those objectives may differ between two webmasters as much as day and night. Many webmasters use a combination of "white-listing" and "black-listing", while others are still focused primarily on "black-listing" (which is what is being discussed here).
Every visitor who comes to your site (with good intentions, of course) is valuable, even if he leaves immediately. I don't think your Google ranking will be affected by the bounce rate. Bouncing is a fact of life, and I would guess Google is interested in other things about your site, for example whether you have stolen material from someone else.
I rarely block users manually. Usually honeypots do the job quite well and catch most of the bad bots.
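For anyone wondering what a basic honeypot trap can look like, here is a rough sketch; the /trap/ path is a hypothetical name, use whatever path nothing legitimate on your site links to. The idea is a URL no human visitor would ever request: it is disallowed in robots.txt and linked only invisibly, so anything that fetches it has ignored the rules and can be added to your deny list.

In robots.txt:

    User-agent: *
    Disallow: /trap/

Hidden somewhere in a page template (humans never see or click it):

    <a href="/trap/" style="display:none" rel="nofollow">&nbsp;</a>

Anything that shows up in the raw logs requesting /trap/ has ignored robots.txt, and its IP can then be added, by hand or by a small script, to the kind of .htaccess denial discussed below.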
In eleven years with my sites I've never, ever seen two things: 1) a log spammer request robots.txt, or 2) a log spammer use a User Agent that an entry in robots.txt could match. The most effective action against log spammers is to simply ignore them. These folks don't have a clue how the www and raw logs work today (at least for 99.99% of websites) anyway, and any act of denial merely seems to provoke these pests into more attempts.

BTW, robots.txt is a request aimed at compliant bots that are willing to respect the protocol, which applies to most any bot (whether it identifies itself as one or not), with the exception of the major search engines (which readily identify themselves, at least in most instances). .htaccess, on the other hand, is an effective control that does not give the visitor and/or bot a choice; it simply denies access (basically a firewall).
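To make that distinction concrete, here is a minimal side-by-side sketch; the bot name and IP range are hypothetical placeholders, and the exact .htaccess directives depend on your Apache version.

robots.txt (a request that a compliant bot may honor and a bad bot will simply ignore):

    User-agent: BadHarvester
    Disallow: /

.htaccess (an actual denial at the server, no choice given):

    # Apache 2.2 style
    Order Allow,Deny
    Allow from all
    Deny from 192.0.2.0/24

    # Apache 2.4 equivalent (uncomment on 2.4, remove the lines above)
    # <RequireAll>
    #   Require all granted
    #   Require not ip 192.0.2.0/24
    # </RequireAll>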
From the Honeypots web page: "Honeypots are a highly flexible security tool with different applications for security. They don't fix a single problem. Instead they have multiple uses, such as prevention, detection, or information gathering." The majority of so-called bad bots today have taken to using what they perceive to be standard User Agents to accomplish their harvesting.
Some people submit to a website just for SEO purposes. You can track their IPs and block those instead. But don't block genuine visitors; they are the reason you gain traffic for your website.
Just as the content of your website(s) doesn't appeal to all visitors, not all visitors appeal to webmasters. Each webmaster must determine what is beneficial or detrimental to their own website(s). And with that in mind, denying access to select visitors, locales, or even entire continents is certainly a wise choice for any webmaster.
Do not block visitors; they are part of your website's traffic. And your website's PR will not be affected by the bounce rate.