I read on SEObook.com that Google is capable of differentiating between the portion of a webpage that contains the main content and the portion that contains the accessories. Is that true? If so, how does it detect that?
I am sure it would be possible for Google to detect navigation bars on the side, top, and bottom by comparing multiple pages on the website. But otherwise, detecting main content vs. accessories doesn't look easy.
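To show what I mean by comparing pages, here's a rough sketch in Python of how cross-page comparison could flag boilerplate. This is just my guess at the general idea, not anything Google has published; the block extraction is deliberately naive and the example pages are made up.

```python
# Rough sketch: blocks of text that repeat across most pages of a site are
# probably navigation/boilerplate; the rest is candidate main content.
# Illustration of the general idea only, not Google's actual method.
from collections import Counter
import re


def text_blocks(html):
    """Very naive block extraction: strip tags, treat each remaining line as a block."""
    text = re.sub(r"<[^>]+>", "\n", html)
    return [b.strip() for b in text.split("\n") if b.strip()]


def find_boilerplate(pages, threshold=0.8):
    """Return blocks that appear on at least `threshold` of the pages."""
    counts = Counter()
    for html in pages:
        counts.update(set(text_blocks(html)))  # count each block once per page
    return {block for block, c in counts.items() if c / len(pages) >= threshold}


# Made-up example pages from the same site:
pages = [
    "<div>Home | About | Contact</div><p>Article one, with its own unique text.</p>",
    "<div>Home | About | Contact</div><p>Article two, also unique to this page.</p>",
]
nav = find_boilerplate(pages)
main = [b for b in text_blocks(pages[0]) if b not in nav]
print(nav)   # {'Home | About | Contact'}
print(main)  # ['Article one, with its own unique text.']
```

The repeated nav block falls out immediately, while the per-page article text doesn't, which is roughly the separation you'd want before deciding how much weight each link gets.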
Guess it's easy: 350 words in a row (main content) and just a bunch of loose words here and there (menus, etc.)...
I agree that it wouldn't be too difficult to compare multiple pages for shared blocks of code. What do you think Google does with this data though? It is believed that they use this to give less weight to sitewide navigation links. Do they also give more weight to links in the content section? What if you made a point of making the block of navigation code unique on each page? It wouldn't be too difficult to keep rotating in different link anchor text. Do you think that would fool Google into giving navigation links full weight? Has anyone here had success with varying the anchor text of sitewide links?
Easy. Break the page into sections based on frames/divs/tables, calculate the ratio of HTML markup to visible text for each section, throw away all sections with too high a ratio, and index the rest. fava
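Something along the lines of what fava describes could look like this; the <div>-based section splitting and the cutoff value are placeholders I picked for illustration, not anything Google has published.

```python
# Toy version of the markup-vs-visible-text heuristic described above:
# split the page into sections, throw away the markup-heavy ones, index the rest.
# The <div>-based splitting and the 1.0 cutoff are arbitrary illustration choices.
import re


def sections(html):
    """Treat each <div>...</div> as one section (very crude, no nesting handled)."""
    return re.findall(r"<div\b[^>]*>.*?</div>", html, flags=re.S | re.I)


def markup_ratio(section_html):
    """Ratio of markup characters to visible-text characters in a section."""
    visible = re.sub(r"<[^>]+>", "", section_html).strip()
    markup_chars = len(section_html) - len(visible)
    return markup_chars / max(len(visible), 1)


def indexable_sections(html, max_ratio=1.0):
    """Drop sections whose markup-to-text ratio is too high; keep the rest."""
    return [s for s in sections(html) if markup_ratio(s) <= max_ratio]


page = ('<div><a href="/">Home</a> <a href="/about">About</a> <a href="/contact">Contact</a></div>'
        '<div>This is a long paragraph of actual article text that clearly outweighs '
        'the small amount of markup wrapped around it.</div>')
for s in indexable_sections(page):
    print(round(markup_ratio(s), 2), s[:50])  # only the text-heavy section survives
```

On the made-up page above, the link-stuffed nav div scores far above the cutoff and gets dropped, while the paragraph div sails through.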
Google can't identify the page structure if you don't optimize it (I mean on-page optimization). You should make proper use of <h1>, <h2>, ..., <strong>, <em>, etc. I think Google can identify the top navigation (as it contains the same links all over the site), and Google always looks at the top-left content first... that explains why most blog templates put the sidebar or navigation on the right.
Google gathers all such info on the basis of HTML tags, so keep working on managing them if you want to do tricks with the Google spider.
Yes, they can. They can also see what is common between pages, like nav bars. But they are also very good at seeing what content is copied from someone else, and that is at the sentence level.
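That sentence-level matching is usually explained in terms of word shingles: hash every short run of consecutive words and compare the sets. Here is a toy sketch of that general idea, purely my illustration and not Google's actual pipeline.

```python
# Toy sentence/phrase-level duplicate detection using word shingles.
# Illustrates the general shingling idea only; Google's real system is not public.
import re


def shingles(text, n=5):
    """Set of hashes of every run of n consecutive words."""
    words = re.findall(r"\w+", text.lower())
    return {hash(tuple(words[i:i + n])) for i in range(len(words) - n + 1)}


def similarity(a, b, n=5):
    """Jaccard similarity of the two texts' shingle sets."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)


original = "They maintain a strong commitment to keep duplicate content out of their search results."
copied   = "We all know they maintain a strong commitment to keep duplicate content out of the index."
print(round(similarity(original, copied), 2))  # clearly above zero despite the edits
```

Because the overlap is computed on runs of words rather than whole pages, lightly reworded copies still share most of their shingles with the source.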
Google does maintain a strong commitment to keeping duplicate content out of their search results, so I believe they are probably constantly improving their methods to fulfill this commitment.
It wouldn't be difficult. Overall it would heavily load the servers, though. But I'm sure if anyone could handle it, it would be Google.
SEObook is one of the most reliable resources about the SEO industry, if not the most reliable... And of course he is right, and they have the servers and the techniques to handle it. What they can't do easily YET is spot paid links, unless of course the guy who buys the links is doing it so obviously that even a 10-year-old would be able to spot it...
You may be referring to a home page and then the auxiliary posts? In that case, you want to build up SEO on the posts as well as on the main page in order to raise the PageRank of the site as a whole.
Yes, they can... and if their bot can't, then surely their evaluators can do this for keyword ranking. Don't forget that they have more than 10,000 evaluators around the world.
By evaluators, do you mean humans who work for Google to improve their SERPs? Where did you get this data, pal?