Hey, great thinking Tyler. Right or wrong (and it looks good to me, insofar as I understand things), it has focused my mind on getting rid of all reciprocal links among my own sites. One small comment: when you say that, it applies only for an even number of sites, not an odd number (unless I got my sums wrong). Cheers
Very good point! Thanks for catching me on this; I had completely overlooked it. With an odd number of sites, each site will have the same number of links. Let's use five as an example:

[(5 x 0.5) - 0.5] x 5 = 10

1 links to 2, 3
2 links to 3, 4
3 links to 4, 5
4 links to 5, 1
5 links to 1, 2

or seven:

[(7 x 0.5) - 0.5] x 7 = 21

1 links to 2, 3, 4
2 links to 3, 4, 5
3 links to 4, 5, 6
4 links to 5, 6, 7
5 links to 6, 7, 1
6 links to 7, 1, 2
7 links to 1, 2, 3
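To make the pattern above concrete, here's a quick sketch (my own illustration, not the original poster's code; the function name is made up) that generates the ring scheme: with an odd number n of sites, each site links to the next (n - 1) / 2 sites around the ring, which matches the formula [(n x 0.5) - 0.5] x n for the total link count.

```python
def ring_links(n):
    """Return {site: [targets]} where each site links to the next (n-1)//2 sites.

    Sites are numbered 1..n and wrap around the ring, exactly as in the
    five- and seven-site listings above. Assumes n is odd, so every site
    ends up with the same number of outbound links.
    """
    per_site = (n - 1) // 2
    return {
        i: [((i - 1 + step) % n) + 1 for step in range(1, per_site + 1)]
        for i in range(1, n + 1)
    }

for n in (5, 7):
    links = ring_links(n)
    total = sum(len(targets) for targets in links.values())
    print(n, total, links[1])
```

For n = 5 this reproduces the listing above (site 1 links to 2 and 3, site 5 wraps to 1 and 2) with 10 links in total, and for n = 7 it gives 21, agreeing with the formula.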
SEOChat ranks for SEO... and it uses reciprocal links with the other sites in its network, links that are coded in plain HTML. So reciprocal links aren't necessarily bad. This is just another (more advanced) strategy for people looking for a different way of linking.
I observed that your theory avoids both reciprocal and three-way links. If you were to include three-way links, how many links would there be in total? A quick adaptation of the formula would be: [(n x 0.5) + 0.5] x n. I might be wrong though. Any thoughts? Regards, George
Well, this does help. Though it should be pointed out that there are 100 pieces to the algorithm puzzle, so some relevant one-way links to your sites on the same C class can help a bit.... You are not going to land #1 for "viagra" using the method described... Spread it out to several different C class servers and the effect intensifies.... but you are still not going to land #1 for "viagra" using the method described... Good job link baiting. Peace
But isn't there a risk of getting penalties from SEs for interlinking our own sites? I read somewhere that the different class Cs may not benefit for long, as G is getting smarter and looks at WHOIS info? Or am I being extremely paranoid?
Cool idea, I'm glad to see someone else thinking the way I do. I think there's a potential problem, though:

1 links to 2, 3, 4, 5, 6
2 links to 3, 4, 5, 6, 7
3 links to 4, 5, 6, 7, 8
4 links to 5, 6, 7, 8, 9
5 links to 6, 7, 8, 9, 10
6 links to 7, 8, 9, 10
7 links to 8, 9, 10, 1

In this scenario, 1 links to 6, which links to 7, which links back to 1. This is a straightforward "three-way linking procedure" that I think search engines like Google would be able to detect.

---

My solution? What I'm writing right now is a PHP script that would do the following:

1. Start out by entering a site on the network along with its specifics (PR, etc.), and assign it an ID. Then have the user enter all the sites this site links to that are part of the network.
2. Use the add tool to add a second site.
3. Enter a proposed link for the new site.
4. Make sure the target site is not a site that already links to the proposed linking site. If it is, warn the submitter and make them confirm they are sure. Check this by both domain AND IP.
5. Make sure the proposed target does not link to a site that links to the proposed linker. If it does, warn the submitter and make them confirm. Check by both domain AND IP.
6. Make sure the proposed target does not link to a site that links to a site that links to the proposed linker. If it does, warn the submitter and make them confirm. Check by both domain AND IP.
7. If a submitter ignores a warning message, flag the link as a potentially detectable network link and make sure the admin is aware.
8. Have the program look at each site and work out which other network sites are eligible link-mates, i.e. which pages the site could link to without throwing up a flag, and list those off so the submitter knows what he can do to make the network more productive.
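The checks in steps 4-6 boil down to one question: given the existing link graph, would the proposed link close a loop of length two, three, or four? Here's a rough sketch of that check (the poster describes a PHP script; this is my own Python illustration with made-up names, searching up to three hops from the proposed target back to the proposed source):

```python
def closes_loop(links, source, target, max_hops=3):
    """Return the offending path if `target` can reach `source` in <= max_hops.

    `links` maps each site ID to the list of network sites it links to.
    A returned path [target, ..., source] means adding source -> target
    would create a reciprocal, three-way, or four-way loop (steps 4-6).
    Returns None if the proposed link is safe at this depth.
    """
    frontier = [[target]]
    for _ in range(max_hops):
        next_frontier = []
        for path in frontier:
            for nxt in links.get(path[-1], []):
                if nxt == source:
                    return path + [nxt]  # loop found back to the proposed linker
                next_frontier.append(path + [nxt])
        frontier = next_frontier
    return None

# Example: in the 10-site listing above, 6 -> 7 and 7 -> 1 already exist,
# so the proposed link 1 -> 6 would close the three-way loop 1 -> 6 -> 7 -> 1:
existing = {6: [7], 7: [1]}
print(closes_loop(existing, 1, 6))  # path [6, 7, 1] back to site 1
```

With max_hops=3 this catches everything up to four-way loops, matching the claim in the follow-up post; a loop that needs a fifth site would pass unflagged.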
-----

Unless I'm making a mistake, the program above would prevent up to four-way linking... I doubt the search engines check further than that, but who knows. I'm sure mine has some problems too, and it works programmatically through trial and error just because I'm not good at coming up with formulas. Input?
Sounds good... Just to clarify something: the idea works for 10 of your own sites only? AND using different IPs for each? Sub-domains in that case won't work well, am I right? I'm asking because I'm considering adding sub-domains to my site made up of the different categories of articles. I have over 20 categories of articles, and making sub-domains may make SEO easier. I'm not sure whether that's a good move or not... just thinking out loud. Again, I like the line of thinking. Thanks for the info.
I would not say penalties until you go too far... but yes, there would be a filter placed if it's found you are interlinking all of your sites, and if they are not relevant, a further dampening filter applied. A penalty would probably be placed if you linked every page of your site A to every page of your site B and vice versa, and both to site C, and so on. One link from each site should suffice..... Peace
I'm glad to see this thread is generating so many quality posts. Although my equation is obviously not perfect, my main objective was to generate good discussion, so I'm glad I posted it.
I tried adding a link to my blog post on a Wikipedia article, but, big surprise, it got deleted in about a day.