I have a site that is optimized very well for its target keywords, and it generally ranks in the top 10 for its keyword phrases in Google. Lately I've noticed that it's on an "every other day" schedule, meaning today it will come up number 8 in G, tomorrow it's not in the top 100, the next day it will be number 7, the next day gone. Why is my site doing this, and is there anything I can do to stop it? On the days the page is gone, I get almost no uniques and make ZERO money. On the days it's listed, I make $10-$25. I'm still building links and doing the necessary updates and off-page SEO for this site. Anyone else suffer from this syndrome?
There isn't much you can do. Since Google uses multiple data centers, you're not always going to be in the same position: when you see yourself at #8, another user might not see you at all. That's the common nature of search engines. What you need to do is stop being dependent on organic traffic; it should make up no more than about a third of your daily traffic if you plan to be successful. What if algorithm changes are made next month that keep you out of the top 100 for good?
I see your point about the organic traffic, but in the niches I'm in, it's a lot harder to get traffic from referring sites than from search engines. SE traffic is great because it's almost as targeted as (if not more targeted than) referral traffic. Believe me, I try to get traffic from wherever I can, and I've become proficient enough at SEO that I can usually achieve good rankings, so it just kills me when the SEs fluctuate rankings drastically.
I feel that fluctuating search results are going to become the norm shortly, as a way to provide better results, and that would be a good thing. For most of my websites I'm able to link from Wikipedia articles; since I do heavy editing on Wikipedia I usually don't get my links bumped, and I make sure they all follow the neutral point of view standard. As an example, Wikipedia brings 500+ visitors a day to the site in my signature. Not a single article I link out from is based on opinion or a how-to; it's all valid, proven, cited information.
I've had this issue twice before, and I don't know exactly what fixed it. Here are the steps I took:
1. Removed all my dead links, 404s, etc.
2. Created and submitted a Google Sitemap.
3. Reworked some of my usually static pages to be more dynamic, and added lots of content relevant to the specific page rather than the entire site.
4. Amended the robots.txt file to restrict certain areas of the site.
5. Registered the domain for 10 years.
6. Added 200% more content.
I haven't lost my first-page rankings since. Hope this helps!
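For anyone wondering what step 4 looks like in practice, restricting areas of a site in robots.txt is just a few Disallow lines. A minimal sketch (the /admin/ and /search/ paths here are hypothetical examples, not the actual directories from the post above):

```
# Apply to all crawlers
User-agent: *
# Keep bots out of internal/duplicate-content areas (example paths)
Disallow: /admin/
Disallow: /search/
Disallow: /cgi-bin/
```

Keep in mind robots.txt only controls crawling; it isn't a security mechanism, and anyone can read the file to see which paths you've listed.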
#1. 404 outbound links won't have an effect on your site's rankings in search results, and 404 pages hit by Googlebot during crawling are reported in Google Webmaster Central's index stats.
#2. You only need an XML Sitemap if your site is new and doesn't have any backlinks. Once a site matures and builds up a lot of backlinks, you don't really need one anymore; proper internal linking removes any need for an XML Sitemap.
#3. Smart move. More content is always the best way to rank well in search engines, even ahead of proper quality backlink building.
#4. I don't see how this could be a solution to the problem.
#5. Useless. Registration length has no effect on search results; if it did, spammers would abuse it and it would be removed as a factor anyway.
#6. Same as my reply to #3.
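For reference, the XML Sitemap being debated in #2 is just a list of URLs with optional metadata, per the sitemaps.org protocol. A minimal sketch (example.com, the date, and the frequency/priority values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

You can submit it through Google Webmaster Central, or point crawlers at it with a `Sitemap:` line in robots.txt.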