Google's algorithms are sophisticated ... a lot more than you think. They read the content on your website, and a link from a website with 'similar' content will weigh more than a link from a website with less similar content...
How does the Google bot determine similar content? I don't think anyone can make an algorithm to determine content similarity!! Does it match some highly used words from both sites and count the occurrences of each word? Or maybe it's just a scare tactic used by Google?
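It's actually not far-fetched: comparing word counts between two pages is a standard text-similarity technique (bag-of-words cosine similarity). This is only a minimal sketch of that general idea, not a claim about what Google actually runs:

```python
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Bag-of-words cosine similarity between two documents.

    Counts how often each word appears in each text, then measures
    the angle between the two word-count vectors: 1.0 means the
    same word distribution, 0.0 means no words in common.
    """
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    # Dot product over the words the two documents share
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm_a = math.sqrt(sum(c * c for c in a.values()))
    norm_b = math.sqrt(sum(c * c for c in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)
```

Two pages about the same topic share many frequent words, so they score well above unrelated pages; real search engines add refinements (stop-word removal, TF-IDF weighting), but the principle is the same.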
I think it is not the quantity of backlinks but the quality of backlinks that helps increase a website's PageRank.
No, it's real ... quality links with similar content are what you need. Take for instance my new forum: forums.delphi-php.net. The domain was registered on the 29th of March, and by the 30th of April it had reached PR5. The reason wasn't the number of backlinks but the quality of backlinks from similar sites. The backlinks I got were no higher than PR5, and most had the "nofollow" attribute attached... So how could I have reached PR5 if Google wasn't taking link quality and link content into account? Check out http://www.backlinkwatch.com and see for yourself.
For PR you need a few links from high-PR sites, or a lot of links from lower-PR sites. That's what the graphs are about. For PR the subject of the linking pages doesn't matter, as long as they are actually indexed (and stay indexed). Let's not confuse this with ranking: for getting found in the SERPs, the subject of the pages that link to you, the quality of your site, the text in the links, the technical layout of your page (etc.) matter a lot.
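The "few high-PR links or many low-PR links" trade-off falls straight out of the PageRank formula from the original paper, PR(A) = (1 - d) + d * Σ PR(T)/C(T), summed over pages T linking to A, where C(T) is T's outbound link count. Here is a toy power-iteration sketch on a made-up five-page graph (the page names and the graph are illustrative only; Google's real toolbar-PR scaling is unknown):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a tiny link graph.

    links maps each page to the list of pages it links out to.
    Implements the non-normalised formula from the original paper:
    PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over pages T linking to A.
    """
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # start every page at 1.0
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Each inbound link contributes its page's PR divided
            # by that page's number of outbound links.
            incoming = sum(
                pr[t] / len(links[t])
                for t in pages
                if page in links[t]
            )
            new_pr[page] = (1 - damping) + damping * incoming
        pr = new_pr
    return pr

# Hypothetical graph: "hub" is boosted by three leaf pages and is
# the single page linking to "me".
graph = {
    "hub": ["me"],
    "a": ["hub"], "b": ["hub"], "c": ["hub"],
    "me": ["hub"],
}
```

Running `pagerank(graph)` shows "me" ending up with a far higher score than the leaf pages, even though it has only one inbound link: one link from a strong page outweighs several links from weak ones, which is exactly the trade-off the graphs describe.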
Actually they can. For example, AdSense puts relevant ads on a site, which means Google can work out what a site is about.
You might get PR 1-3 from a single PR 4 link, and just a few links from higher-PR sites can get you a higher PR as well. It doesn't take lots of sites; what matters is the PR they have. Like they said, it's quality over quantity.
It's simple: quality is better than quantity. Google reads your content and bases PR partly on that too. How could that be proven? Nobody knows the Google algorithm, right? Well, if PR weren't (partially) based on the content of your site (keyword relevance), some guy would already have found a mathematically 100% correct algorithm to calculate PageRanks. The fact that nobody has found a 100% working algorithm proves that the PageRank calculation is quite difficult, and it must rely on content and keywords too... The backlink charts I see here are all based on content-irrelevant backlinks, I guess. These charts can never be completely right, I believe; to get them right we would need to know the calculation, and nobody except the Google guys knows it.
Copyscape is for finding identical content, not similar or related content. The Google bot determining a site's category and similar content, and Copyscape finding identical content on other sites, are two totally different things.
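The distinction is easy to see in code. Exact-duplicate detection typically works on shingles (overlapping runs of consecutive words), so paraphrased text about the same topic scores near zero even when a word-frequency similarity measure would score it high. This is a generic sketch of shingle overlap, not a claim about Copyscape's actual implementation:

```python
def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def duplicate_score(text_a, text_b, k=3):
    """Jaccard overlap of shingle sets.

    Near 1.0 means long stretches of identical wording (copied text);
    merely on-topic but reworded pages score near 0.
    """
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

So "the quick brown fox jumps over the lazy dog" scores 1.0 against itself but 0.0 against "a fast brown fox leaps over a sleepy dog", even though both sentences are clearly about the same subject: duplicate detection and topical similarity really are different problems.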
Exactly. It is, in my opinion, just one of the hundreds of myths that people enjoy repeating but have absolutely no evidence for. It just makes people sound like they know what they are talking about. I have never seen any evidence that PR or the SERPs are affected by relevancy. I do, however, think that Google assigns a weight to a site based on things such as age, and that this weight affects the points it passes to the sites it links to.
There are so many tables on the internet for this, but I don't think they show the real picture. You should focus on getting quality backlinks.