PR doesn't make sense to me, I guess, but I was just thinking. A page could, theoretically, get to PR10 from a huge number of PR0 links, right? Well, think about this... couldn't you write a one-time PHP script, loop it about 1 million times, and have it do the following: seed the RNG with srand(), pull about 100 random words per page from an SQL table of about 1000 word entries (that's on the order of 100! possible randomly generated pages), fputs() the content into a new random filename ("name" . rand() % 10000000 . ".php"), and dump a link to your site in there. Wouldn't this generate 1 million backlinks with mostly unique content? I wouldn't dare consider actually doing this, but following how PR works, would it work?
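For concreteness, a minimal sketch of the script being described; the DSN, credentials, the hypothetical `words` table, and the target URL are all placeholders:

```php
<?php
// One-off generator along the lines described above: pull random words
// from a ~1000-row table, write them to a randomly named file, and drop
// a backlink in each page. Illustration only.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

for ($i = 0; $i < 1000000; $i++) {
    // ~100 random words per page as the "unique" content.
    $words = $pdo->query('SELECT word FROM words ORDER BY RAND() LIMIT 100')
                 ->fetchAll(PDO::FETCH_COLUMN);

    // Random filename, as in the post. Modern PHP seeds the RNG itself,
    // so an explicit srand() call isn't needed.
    $file = 'name' . rand(0, 9999999) . '.php';

    $html = '<html><body><p>' . implode(' ', $words) . '</p>'
          . '<a href="http://www.example.com/">my site</a></body></html>';

    file_put_contents($file, $html); // fputs()/fwrite() would do the same
}
```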
You would have one million links coming from the same website. That won't help at all, and the amount of disk space you would need to do it would be huge. Google isn't stupid.
Yes. Spammers do this all the time. It generates backlinks, but not PR weight. A page cannot pass on more PR than it itself has. So you would need to build inbound links to each of your spam pages. The time to do so would be better spent just building links to the site you want to rank well in the first place.
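A rough back-of-envelope of why, using the commonly quoted formula PR(A) = (1-d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)) with the paper's damping factor d = 0.85:

```php
<?php
// Arithmetic behind "a page cannot pass on more PR than it itself has".
$d = 0.85;

// A spam page with no inbound links of its own settles at the minimum:
$spamPR = 1 - $d;            // 0.15

// With a single outbound link, the weight passed to the target is
// d * PR(page) / (number of outbound links):
$passed = $d * $spamPR / 1;  // ~0.1275

echo "Each unlinked spam page passes roughly $passed PR\n";
```

So the per-page contribution is tiny unless you build links into the spam pages themselves, which is the point about the time being better spent.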
I guess I have a bit to learn about PR. I thought internal links did count towards PR ranking, even if the links were from meaningless pages.
"A page cannot pass on more PR than it itself has." ""PR cannot be created nor destroyed"." I don't think this is true, atleast according to the original papers on PR. I believe the original idea at the beginning of this thread would work in theory - it wouldn't be a PR 10 though. http://www.webworkshop.net/pagerank.html (obviously not a definitive source, but I've read this in several places.)
I've toyed with similar ideas. For example, a tarot site that created a page for every possible combination of cards and pulled the interpretations off a database, or an I-Ching site that did the same (I wrote a similar programme for the Atari ST years ago, very easy). That way you would actually have content, but without knowing Google's algorithm for assessing duplicate content I would be cautious. It would be better to use a dynamic page with mod_rewrite, though, I'd have thought - something like the sketch below. Or how about adapting one of those random poetry generators? A new page for every poem generated.
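A minimal sketch of that mod_rewrite idea, assuming a hypothetical `interpretations` table; the URL scheme, column names, and rewrite rule are all made up for illustration:

```php
<?php
// tarot.php - one dynamic script serving every card combination,
// instead of a million static files. Assumes an .htaccess rule like:
//   RewriteEngine On
//   RewriteRule ^tarot/(\d+)/(\d+)/(\d+)\.html$ tarot.php?c1=$1&c2=$2&c3=$3 [L]
// so each combination looks like its own page to a crawler.
$pdo = new PDO('mysql:host=localhost;dbname=tarot', 'user', 'pass');

$cards = [(int)($_GET['c1'] ?? 0), (int)($_GET['c2'] ?? 0), (int)($_GET['c3'] ?? 0)];

echo '<html><body>';
foreach ($cards as $id) {
    // Pull each card's interpretation off the database, as described above.
    $stmt = $pdo->prepare('SELECT meaning FROM interpretations WHERE card_id = ?');
    $stmt->execute([$id]);
    echo '<p>' . htmlspecialchars((string)$stmt->fetchColumn()) . '</p>';
}
echo '</body></html>';
```

The nice part of the rewrite approach is that a page only exists when its URL is actually requested, so there is nothing to store on disk and nothing to regenerate when the interpretations change.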