How many links should you aim to get per month? With a lot of people saying that too many links too quickly is a bad idea, how do you decide how many is optimal? On the other hand, if you get too few, you will never catch your competitors. Also, is there a list of directories that offer free one-way links rather than reciprocal ones? I know those are very hard to find, and I know Google is not too keen on recips.
I think it depends on several factors. If you have a big, popular and established website, you can get a lot of links a month without any problem. If you've just started a website, Google will frown if you gather hundreds of backlinks within a couple of weeks.
In my opinion, if the sources of the links are clean, rapid growth in their number shouldn't raise any suspicion in Google's eyes. There are other factors that may affect a site's search ranking, but they've got nothing to do with that. Another thing to keep in mind here is consistency, as a lot of old links without new ones are of little help. As for one-way link acquisition methods, article submission should be undertaken too, of course.
http://info.vilesilencer.com has exactly what you're looking for. So far I haven't seen a penalty from too many links per month, but I have no real evidence of benefits from it either. The free version of my software (in my sig) adds a link to its download page. I'm averaging 15,000 new links per month at the moment, but without a Google backlink export, I can't tell how effective 15,000 identical links are. What I can say is that when I look at my Google Sitemaps statistics, the software download page is my page with the highest PR. I'll know a lot more once a TBPR export occurs.
I haven't reached the max yet, I guess. My carcasher entry page gets probably 5 new backlinks a day and it's staying pretty fresh. My computer cables website is getting about the same, and my ranks keep growing there. My newest one, the Steelers 12th Man, has yet to be indexed, so we'll see. I guess slow but steady seems to be a good plan.
15,000! Surely 15,000 links a month is asking for trouble as far as Google is concerned. That's way, way too many.
I'm a little concerned about it myself, but not that much. I'm not overly concerned because I try to build traffic without relying on search engines. One look at my Alexa rank will show that I've had some success in that area. I never thought my software would be as popular as it is. The next PR export will tell.
Here are the results from two Google searches (one from a Bigdaddy datacenter, one not): http://216.239.51.104/search?hl=en&q="submitted+with+article+distributor" http://66.102.9.99/search?hl=en&q="submitted+with+article+distributor" And this is from my Google Sitemaps Crawl Stats page:
I'd say 15,000 is not too bad. If you exchange sitewide links with someone, and they have a website with 15,000 pages, you can get that many links in a day.
Yeah, these are essentially sitewides. 95%-99% of the links reside on only 300 sites. As the articles get reprinted by other sites the links will disperse more, but given that they're all the same anchor text, it really isn't as helpful as it sounds.
I think it all comes down to consistency and variation. For example, a popular blog like Matt Cutts's would gain a lot of links per day through reputation and word of mouth, but people would be linking in with all sorts of anchor text. If they are natural links, I don't think there is such a thing as too many links per day. With recip links, however, variation would have to be one of the biggest contributors. Brad
It is an interesting question. Consider this: suppose a government launched an "essential online resource" for its country; should it expect to be penalised? Or would a site like that be reviewed by a person? And would it be seen as a fairer system if several abuse-detection signals were weighed together, and to what degree each applied, before reaching such a conclusion? There may be exceptions, understandably, but in general the current system does seem to allow for the many variables that can occur by accident. It is no wonder that the search engines need scientists to devise some sort of equitable system, given the size and complexity of the task.
I posted some related information in this thread: http://forums.digitalpoint.com/showthread.php?p=652550 It doesn't appear that 15,000 links a month hurts you.