I have read in other forums that Google will not read or index more than 50 links per page. I was just mentioning this to a friend of mine and he asked what, upon reflection, is an obvious question: does the 50-link limit -- if that is accurate -- apply to both internal and external links, or only external? If it applies to internal links, that would mean site maps would have to be broken up into 50-link groups.
It's not accurate. There are of course aesthetic reasons for the user, but no technical ones. Google can and will index every link within the first 100k of a document. Anything after 100k is truncated (all content). - Shawn
Google also has an official recommendation about that (http://www.google.com/webmasters/guidelines.html): "Keep the links on a given page to a reasonable number (fewer than 100)." That recommendation is repeated once more on that page. However, I have noticed that many OTHER bots follow FAR fewer than 100 links per page (as few as approx. 5-10 links at the top of the page), while I have watched very FEW other bots follow many hundreds, or even crawl almost my full site at once. To overcome such problems I manually ADD all new pages at the top of the sitemap for a while.
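For anyone who wants a quick way to check a page against that guideline, here is a minimal sketch using only the Python standard library. It just counts the <a> tags on a page; the URL is a placeholder, so swap in your own sitemap or index page.

Code (markup):
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    """Collect the href of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

url = "http://www.example.com/sitemap.html"  # placeholder -- use your own page
html = urlopen(url).read().decode("utf-8", errors="replace")

counter = LinkCounter()
counter.feed(html)

print(f"{url} contains {len(counter.links)} links")
if len(counter.links) >= 100:
    print("Over the ~100-link guideline; consider splitting the page.")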
Yeah, Compar/Hans, that's a good idea. I too have asked this question - in particular, in relation to a site map - though not often considering secondary bots. That'll explain a few things. I think I'll do exactly that - rotate new links into the topmost position. All round it's the wisest thing to do. Thx!
You just have to love the way Google tells you to keep your page links below 100, on a page of their own with about 140 links on it ;-) OWG
Is there a site that can simulate something close to what Googlebot sees, including the 100K limit? 100K is a pretty big page, though.
OWG, did I miss something? I can't find 20 links on that page, never mind 140. I guess you'd say it's just cos I'm from Bristol tho ;-)
Generally, the question is not so much of "how many links should I have on a page" as much as "where do I want this page linking to".
Hi Bob - I was thinking particularly in terms of Search Theme Pyramids, where visitors, bots, PR and the like are all dealt with in narrowed subject ranges. Another consideration is that the keywords used in anchor text appear to count in a very specific way towards the keyword weighting of the page. That's why some SEOs break headings up into linked headings, i.e. each heading has a "#name" anchor, or else link out to resources based on the same keyword, which is something Aaron at www.SEObook.com makes a very common practice of. Worrying about the number of links is primarily an issue when focusing on PageRank - more specifically, PageRank loss. However, as PageRank seems more and more diminished in the view of many webmasters, using internal and external links to emphasize a specific theme for a page seems to be an alternative strategy that has come under more focus.
It's actually 101kb or 100 links... But to see your page the way Googlebot sees it:
http://www.delorie.com/web/lynxview.cgi?url=http://www.digitalpoint.com
http://www.delorie.com/web/lynxview.cgi?url=http://www.yoursite.com
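If you'd rather do a rough check yourself, here is a small sketch using only the Python standard library: it fetches a page and reports how much of it falls past the ~100 KB cutoff discussed in this thread (the exact figure -- 100kb vs 101kb -- is only what's claimed here, not anything documented). The URL is a placeholder; use your own page.

Code (markup):
from urllib.request import urlopen

LIMIT_BYTES = 100 * 1024  # assumed cutoff, based on the figure mentioned in this thread

url = "http://www.yoursite.com/"  # placeholder -- use your own page
raw = urlopen(url).read()

print(f"Page size: {len(raw):,} bytes")
if len(raw) > LIMIT_BYTES:
    over = len(raw) - LIMIT_BYTES
    print(f"{over:,} bytes fall past the ~100 KB mark and would be ignored if the truncation claim holds.")
else:
    print("The whole page fits within the ~100 KB window.")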
As far as I am aware, Google won't spider more than 100 links on a PR0 page, and more than 50 links per page results in much reduced PR and link power passed through each link. This applies to both internal & external.
I don't know anywhere Google says that PR makes any difference, or even definitely that they won't spider beyond 100 links -- they do recommend no more than 100 links, and that MAY mean they won't go beyond that number (on a PR0 or a PR8 page), or it may just mean that they, like most people, are warning you that more than that many on a page is just a bad idea for many reasons.
I'm with Minstrel on this one. You'd have to show me definite proof. I'm sure there is absolutely no connection between PR and how much Google will spider. The connection with PR is how often Google will spider: the higher the PR, the more frequently Google will crawl the page. The only limit with Google is the size of the page. If the page exceeds this size -- 100kb I think -- then any part in excess of the size limit doesn't get spidered. But again, this is completely independent of PR.
OK, here is an example for you. I had 2 directories, both close copies of each other. One was a PR4 with 55 internal links and 34 external links; the other was a PR0 with 55 internal links in the same config as the first directory, plus 52 external links. Both directories had external links to sections of a website I had redesigned, and they were linking to the new sections before the site went live and before there were any other links to those sections on the net, in order to improve spidering and SEO. The first set of links, from the PR4 directory, pointed to pages like site.com/HongKong.asp for specific country jurisdictions, and all of those pages were spidered in a little over 3 weeks. The second directory, the PR0 one, linked to the USA jurisdictions (e.g. site.com/usa/usa.asp?jurisd=Alabama), and only 6 of those external pages were indexed in 2 months before the server went down. These new links went up within a day of each other, and each directory was spidered within a week of the links going up. Now, perhaps it was some unknown other variable, but I'm fairly sure it's the PR, and I have heard a couple of other people mention it on other forums as well. If you would like to carry out your own test, I eagerly await the results.
So Google will not lower your domain rank based on the links page? Even if it has a lower indexing rate and PR due to the excessive number of links on it, it still can't signal Google to lower your domain rank or even ban you, right? Sorry, I know I have many questions.