According to Google: "The software behind our search technology conducts a series of simultaneous calculations requiring only a fraction of a second. Traditional search engines rely heavily on how often a word appears on a web page. We use more than 200 signals, including our patented PageRank™ algorithm, to examine the entire link structure of the web and determine which pages are most important."

QUESTION: We have discussed links to relevant, reputable sites, and a few more SEO issues. Which are these 200 signals? Can we compile a good list, to have the best possible understanding?

According to Google: "Hypertext-Matching Analysis: Our search engine also analyzes page content. However, instead of simply scanning for page-based text (which can be manipulated by site publishers through meta-tags), our technology analyzes the full content of a page and factors in fonts, subdivisions and the precise location of each word. We also analyze the content of neighboring web pages to ensure the results returned are the most relevant to a user's query."

QUESTION: Do you understand the issue with the "neighboring" pages? What about one-page web sites? And do you understand what they mean when they speak about the "precise (!) location" of each word? Thank you for your time.
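Since the quote name-drops PageRank, here is a minimal sketch of the published power-iteration idea, just to make the "link structure" signal concrete. The toy three-page graph, the damping factor of 0.85, and the iteration count are illustrative assumptions on my part; whatever Google runs in production is obviously far more elaborate.

```python
# Minimal PageRank power iteration over a toy link graph.
# Graph, damping factor, and iteration count are illustrative only.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
damping = 0.85
n = len(links)
rank = {page: 1.0 / n for page in links}  # start with uniform rank

for _ in range(50):
    new_rank = {}
    for page in links:
        # Each linking page passes its rank on, split evenly
        # among all the pages it links to.
        incoming = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - damping) / n + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

In this toy graph, page C ends up with the highest rank because it collects links from both A and B, which is the basic intuition: pages are "important" when important pages point at them.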
Meaning that if the first paragraph of your content talks about cat food, there is a chance your site is about cat food. More than if "cat food" only appears in the last words of your content.
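One way to picture that position weighting is a toy scorer where an occurrence counts for more the earlier it appears in the text. To be clear, this is my own invented illustration of the idea, not anything Google has published:

```python
def position_weighted_score(text: str, phrase: str) -> float:
    """Toy score: each occurrence of `phrase` is weighted more
    heavily the earlier it appears in `text`. Purely illustrative;
    not Google's actual formula."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(words)
    score = 0.0
    for i in range(n - len(phrase_words) + 1):
        if words[i:i + len(phrase_words)] == phrase_words:
            # Weight decays linearly from 1.0 at the start toward 0.
            score += 1.0 - i / n
    return score

early = position_weighted_score("cat food reviews and more text here", "cat food")
late = position_weighted_score("more text here about reviews cat food", "cat food")
```

Here `early` beats `late`, matching the intuition above: the same phrase, same number of occurrences, but the page that opens with "cat food" scores higher.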
I think the neighbouring-pages issue is a question of evaluating whether this page on your site is a better match to the search query than that page, bearing in mind that a website covering a particular subject is likely to have keywords and phrases repeated throughout the site, particularly in terms of text links etc.
I would say it means other pages from your site pointing to that page, kind of providing a reference in case visitors want to find out more. However, I can't say much about the "precise location of each word". Maybe it analyzes how the keywords are distributed across the article.
Neighboring pages is about "themes". Google tries to determine the theme of your website by the co-occurrence of words on each page and the neighboring pages. For example, if your site uses the term "white house", Google expects to also see the term "president" or "united states", either on the page or on a neighboring page. Having the related words clustered together in nearby pages strengthens the theme of your site and helps you rank for the term "white house".
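The co-occurrence idea above can be sketched as a toy "theme strength" check: a term's theme gets stronger when related terms show up on the page or its neighbours. The related-term list and the scoring are invented for illustration; nobody outside Google knows the real signal.

```python
# Toy theme-strength check based on co-occurring related terms.
# The `related` vocabulary and the counting scheme are assumptions
# made up for this example, not Google's actual signal.
related = {"white house": {"president", "united states", "washington"}}

def theme_strength(term: str, page_text: str, neighbour_texts: list) -> int:
    # Pool the page with its neighbouring pages, then count how
    # many related terms appear anywhere in that pool.
    combined = " ".join([page_text] + neighbour_texts).lower()
    return sum(1 for rel in related.get(term, set()) if rel in combined)

strong = theme_strength(
    "white house",
    "The White House is home to the president.",
    ["History of the United States presidency."],
)
weak = theme_strength("white house", "We painted our white house.", [])
```

The site that also mentions "president" and "united states" nearby scores higher for "white house" than the one that merely contains the phrase, which is exactly the disambiguation the "neighboring pages" wording seems to describe.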
"What about one-page web sites?" I think one page sites are not liked by google. Google love content where in one page sites, there is not much content, these sites are just like sales page.
It's been known for years that Google ignores keyword stuffing, but their formula seems to change regularly. The approach I adopt is to create web pages for visitors, not for search engines.
Yeah, neighbouring pages is about building themed groups and the internal linking from each page to the others. Internal linking is one of the really key ranking factors these days, IMO, more so than ever; you can really make or break a site with it.
Exactly. When I read the thread title, the first thing that came to my mind is "no one". Believe me, it's not only us trying to uncover the secrets of Google's algorithm. All of their competitors are, too. We will never really know what "Google likes" and what "Google doesn't like"; we can only look at what's happening with other sites and use that information the best way we know how. It's so much better to design our web pages for our visitors; word-of-mouth is one of the best advertising techniques out there.
I think it's best to design primarily for the search engines but ensure visitors are satisfied. The search engines will eventually rank pages accurately according to visitor satisfaction, but at the moment the algorithm is not entirely based on this. It's getting there, but I believe a site made for Google will still rank higher than a site made for visitors. Over time, adjust it to try and match Google's algorithm as it improves, until we reach the point where all the sites on the internet are made for users/visitors.
Neighbouring pages is about detecting whether a site is dedicated to the thing the user is searching for. Maybe the user searches for "mp3 players" and one site has this phrase in its text, but the site is not dedicated to software or players, etc. Maybe another site with the phrase on its first page has music players as its main subject, so this second site is probably more relevant.
Any time you get into specifics of Google's algorithm, the answers you will get are educated guesses at best. No one really knows, except maybe Matt Cutts and he's not talking.
Yeah, try building your site specifically for the visitors, not the search engines. If the visitors like what they see and stay longer, Google and all the other search engines will notice that. It will help you a lot. Remember, they are in the search business. That means they want to give their visitors the best results possible. SEO ain't gonna do that! Your content will!
Getting a high ranking on Google's SERPs is not about understanding the complexities of Google's algorithm; it is all about doing the well-known SEO techniques right. If you can write high-quality content and get a lot of organic links pointing to your site, you will inevitably rank pretty well. Digging too deep into the minor things is not really helpful, IMHO.
It's not a matter of like/dislike; Google likes all the new and innovative things! If you have a site with hundreds of thousands of pages and the content is not that good, then it'll not give any importance to that content. I've got a site with only one page, just 200 words of content on it, and it has PR4 with a position in Google SERPs.
Google makes changes to its algorithm almost daily, so even if one understands it now, a month later things can be different.
That is true, although I believe the core structure would remain the same. The search results generally move about, but good sites stay up at the top of the SERPs.