I heard recently that the search engines are moving to a new, much cleverer system that analyses the actual content of a page and disregards pointless repetition of keywords. Is this true?
From what I've read, that is correct... it's latent indexing, I believe. Search engines, especially Google, will ignore obvious keyword optimization and look at your copy as a whole to determine its theme. It's just another step towards the engines trying to deliver higher-quality search results.
It might be true if what you have in mind is really latent indexing. I hope it comes into practice soon, because what we now call keyword optimization, with the same keywords repeated a hundred times, is pure madness.
That is partially correct... although the search engines are constantly changing their algorithms anyway, so can you really say this is any different than before?
Didn't this process begin years ago already? Gone are the days when you could type Viagra Viagra Viagra at the bottom of the page and rank for the term. Or... are they?!!
I attempted a non-mathematical explanation here: What is Latent Semantic Indexing (LSI)? Hope it helps. - Michael
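For anyone who does want a peek at the math, here is a toy sketch in Python/numpy. It's my own illustration with made-up documents, not anything the search engines have published: a term-document matrix is reduced with a truncated SVD so that pages about the same topic end up close together even when they don't share the exact keyword.

```python
import numpy as np

# Toy term-document matrix: rows are terms, columns are documents.
doc_names = ["pharma_1", "pharma_2", "cooking_1"]
counts = np.array([
    [3, 0, 0],   # viagra
    [2, 3, 0],   # pill
    [1, 2, 0],   # dosage
    [0, 0, 2],   # recipe
    [0, 0, 3],   # oven
    [0, 0, 2],   # bake
], dtype=float)

# LSI in a nutshell: a truncated SVD keeps only the k strongest "concepts".
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2
doc_vectors = (np.diag(s[:k]) @ Vt[:k, :]).T   # one k-dim vector per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

for i in range(len(doc_names)):
    for j in range(i + 1, len(doc_names)):
        print(doc_names[i], "vs", doc_names[j],
              round(cosine(doc_vectors[i], doc_vectors[j]), 2))
# pharma_1 vs pharma_2 comes out highly similar even though pharma_2
# never mentions "viagra"; both come out near zero against cooking_1.
```

The point of the toy: pharma_2 never contains the word "viagra", yet it still lands right next to pharma_1 in the reduced concept space, which is exactly the behaviour that makes mindless keyword repetition pointless.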
Well, it makes a lot of sense. I have always insisted that everyone should write for humans and not machines. Whenever a content writer asks me about keywords, I just shrug and tell them to write a good, compelling story instead. Why? Because over time the machines have only one goal: to become like humans. Search engines will try to understand the text and make sense of what is written; the ultimate goal is to approach the behavior of a human reader. Whether the buzzword is LSI, neural computing, or something not yet invented is not really the question. In reality it is all about understanding text the way a human does. So by writing content for humans, I am perfectly prepared for the future.

Honestly, I cannot wait until all the machine-generated spam is gone. On the other hand, as the robots become smarter, so will the spam generators, which is OK at some point if they can generate content that's really useful. But what's the point? Everyone will generate a couple of million "useful" pages, and then what? The Internet drowns in its own spam? Again, the robots will fight back by increasing the importance of TrustRank and social media sites to sort through the crap. That's another aspect of the future of SEO: social media, personalization, and the sorting of content.
Andre75 put it rightly: articles should be for people, not for machines. All that "keyword optimization" does is try to game the search engine results.
Thanks everyone. Well put, Andre. I agree the search engines will just keep refining their focus on content to keep the surfers happy.
Their machines get smarter, my machines get smarter. It's an SEO arms race. Social media is even easier to spam than the older systems.
Not necessarily. It all comes down to profiling the social media users. You will probably submit your own sites much more often than others do, and that kind of pattern is relatively easy to filter out. Your machine might get faster, but not smarter (since you are the one programming it), and you won't have the resources to keep up with them. As I said, there are always holes to be found, but if they can truly understand the content on a site, then we will finally have useful search engines. If the useful text is auto-generated, so be it. But it would have to be better than the resources it was scraped from, and that is next to impossible.
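Just to illustrate what I mean by "relatively easy to filter out", here is a hypothetical sketch; the function name and thresholds are mine, not any site's actual rules:

```python
# Entirely hypothetical sketch of the kind of self-promotion filter a
# social bookmarking site could run: flag users whose submissions are
# dominated by a single domain.
from collections import Counter
from urllib.parse import urlparse

def looks_self_promotional(submitted_urls, threshold=0.6, min_submissions=10):
    """True if one domain accounts for most of a user's submissions."""
    if len(submitted_urls) < min_submissions:
        return False  # too little history to judge
    domains = Counter(urlparse(u).netloc for u in submitted_urls)
    _, top_count = domains.most_common(1)[0]
    return top_count / len(submitted_urls) >= threshold
```

A real system would look at far more signals than this, but the principle is the same: the pattern of behavior gives the spammer away, no matter how many accounts are involved.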
That's no problem. I can have thousands of identities, come from IP addresses all over the globe, and delete cookies faster than you can say Rumpelstiltskin.