Hi. As a company we're a bit tired of handing loads of money to Google et al. via PPC, so we've embarked on a natural-listing SEO programme. Over the last two weeks I have experimented with two types of page. Both are 'optimised' for one keyword only: ~900 words of content in the body, ~2-3% keyword weight. One page is HTML, and the other is an identical .asp file with an extra RSS feed inserted from a search-engine return for the keyword in question. This was done partly to experiment, as I said, but also to reduce the duplicate-content percentage between the two files. In addition, I have begun slowly adding links with good anchor text via text-link ads, avoiding site-wides.

In two weeks I got to 5th and 6th in Google for our domain's index page (the .htm), and onto page 1 of Yahoo and rank 8 in MSN for the .asp. Everything was fine until yesterday, when it was flipped on its head: a massive drop in the Google SERPs, but page 1, rank 1 listings in MSN. Now I am wondering if the duplicate-content filter has kicked in, as I had a link to the .htm files off the home page, and a link to the .asp on another page linked off the home page. The .asp's RSS content may recently have returned a link to my own site in its feed, as I ended up top of MSN (if you are still with me)...

1) Have you ever gone to page 1 at the beginning, then dropped massively the next week?
2) Is this inserting of RSS for SEO a good thing anyway?
3) Can I buy just one set of site-wide links to get me started? Some sites you see have thousands of inward links, rubbish repeated copy, everything on their pages that Google supposedly doesn't like, and they are number 1?

Thanks, MJ.
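For anyone wanting to reproduce the ~2-3% keyword-weight figure mentioned above, here is a minimal sketch of how that density is usually calculated (keyword occurrences as a share of total words). The page text and keyword below are made-up examples, not from the poster's site:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Toy 300-word body: 10 keyword mentions out of 300 words ~= 3.3%
body = "widgets are great " * 10 + "filler text here " * 90
print(round(keyword_density(body, "widgets"), 1))  # 3.3
```

At ~900 words of body copy, a 2-3% weight works out to roughly 18-27 mentions of the keyword, which gives a feel for how often it needs to appear.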
I must add that I only created 3 pairs of .asp/.htm pages; it's not as if I created hundreds of them. What did happen is that every pair dropped out of the Google rankings sharpish, from a very quick page-1 placement. If I had done something wrong, why did Google rank me so highly in the first place?
It's hard to say exactly without looking at the site, but the situation you are describing makes me worry about the duplicate-content filter. The dup-content filter means one or both of the pages get ignored, which could result in a loss of SERPs. That said, you may not have anything to worry about just yet. Sometimes the SERPs can jump around a bit if there is some disagreement between the data centres. I wouldn't make any drastic changes for the next week or so. Make sure the SERPs have settled down and you are indeed not ranking well, and then start making some changes.
Brad Callen talks about this in his latest newsletter update: http://www.seoelite.com/Lessons/ArticleMarketing.htm Basically, it looks like the search engines want to cut down on RSS feeds being duplicated all over the web and reward those who publish original content. Good luck.
I can't say for sure what happened with your site, but RSS could be a factor. I had my RSS-feed-based site 4th in the SERPs for more than two weeks for very competitive keywords (it was earning around $30/day in AdSense), and one day, boom... the entire site was banned from the Google index. I have no clue what happened, but since then I've been very careful with RSS (aka scraping!). (BTW, it didn't matter much to me, as I was only experimenting with RSS feeds on the site, so I'm not too bothered about wasting the domain.)