I started noticing this back in early-to-mid 2013. I have a site ranking for a highly monetizable keyword and many long-tail variations of it. There's one page that targets the main keyword, and sub-pages each individually targeting specific long-tail versions and related keywords. Let's use our good ol' friend "dog training" as a hypothetical example. The site has something like this:

site.com/dog-training
site.com/dog-training/canine-hunting-tips
site.com/dog-training/show-dog-obedience-guide/
etc.

Although I had individual pages targeting each of these smaller keywords, Google was ranking one particular page of mine higher than all of them: the top of the silo (site.com/dog-training in the example above). Google is giving preference to one single page that covers the broad topic, rather than individual laser-focused pages.

The Test

3 months ago I created a sister site (we'll call this the deep site) targeting the same keywords as my original site (we'll call this the wide site). Brand new URL. The only difference in the deep site is that there are far fewer pages, but each page is packed with keywords. Note, I'm not saying keyword-stuffed, but each page includes every word that can create a long-tail variation, plus many synonyms. The average word count per SEO page is 932, while the average word count on the wide site is 387. (A rough sketch of how you might measure this yourself follows at the end of this post.)

Deep site characteristics
- Targets: 33 keywords (same as the other site)
- 8 pages of unique content. Only 4 are useful pages (i.e. not the about or contact pages)
- Onsite strategy: conservative 2014 standards
- Backlink profile: 8 referring high PR/PA/DA domains
- Average PA: 33.2
- Average DA: 24.2

Wide site characteristics
- Targets: 33 keywords (same as the other site)
- 37 pages of unique content. 33 of them are for each keyword
- Onsite strategy: conservative 2014 standards. Same as above.
- Backlink profile: 12 referring high PR/PA/DA domains
- Average PA: 38.9
- Average DA: 28.6

Results

[Deep site traffic screenshot]
[Wide site traffic screenshot]

Conclusion

Despite being a 3-month-old domain (vs. 3.2 years for the wide site) and having considerably less link juice pointed at it, the new site with deep pages is outperforming the wide site with many pages. And the gap is widening: the newer site is trending toward more traffic each week. Google is giving more preference to individual pages that cover the entire niche, which in turn cover the sub-topics in that niche. Another benefit is that since there are fewer pages to rank, fewer backlinks are needed. You get much more mileage out of each high PR/PA/DA link.

Caveats: Of course this isn't a 100-site, full-blown statistical analysis, and there are many variables that could cause a discrepancy. For example: exactly how powerful are the backlinks going to each site? Each site has unique content; is one optimized better? But I think the kicker here is the age of the new site. A brand new, 3-month-old domain should not be outperforming a 3-year-old domain using the same off-page SEO strategy unless it's got something going for it. My conclusion: deep, keyword-optimized pages.
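For anyone who wants to reproduce the word-count comparison above, here is a minimal Python sketch. It is not the tooling I used; the page URLs are placeholders mirroring the hypothetical dog-training silo, and the visible-text extraction is a rough approximation, so treat the numbers it produces as ballpark figures only.

```python
# Rough sketch (assumptions: placeholder URLs, naive text extraction)
# for estimating average visible word count per page on a site.
import re
import urllib.request
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)


def word_count(url):
    """Fetch a page and return a rough count of visible words."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.parts)
    return len(re.findall(r"\b\w+\b", text))


if __name__ == "__main__":
    # Placeholder page list based on the hypothetical example above.
    pages = [
        "https://site.com/dog-training",
        "https://site.com/dog-training/canine-hunting-tips",
        "https://site.com/dog-training/show-dog-obedience-guide/",
    ]
    counts = [word_count(u) for u in pages]
    print("Average words per page:", sum(counts) / len(counts))
```

Run it once against each site's page list and you get comparable per-site averages like the 932 vs. 387 figures quoted above.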
Very interesting results, indeed. One explanation could be that Google favors pages with more content, i.e. a 2000-word article covering 4 keywords will rank higher than 4 pages covering one keyword each. It is a smart assumption on Google's part: if somebody makes the effort to write 2000 words, the page is probably more informative (read: useful) for the end user. I assume your "deep" site has large pages - you need a fairly large page to cover 33 keywords - hence it ranks better than the short pages.
You have got the point, friend... The word count in each blog post or article plays a vital role.
DiggitySEO, your threads are always so valuable! I love reading them. I have also noticed that Google has started to prefer topic pages over specific keyword pages. I am going to start building my pages based on the info you just provided. It makes sense. Thanks!
Great post and test. I agree that Google favors topic pages over 100 pages each focusing on one keyword.
Update. With the latest Panda 4.0 update (May 17-18, 2014, http://searchenginewatch.com/article/2345884/Google-Launches-Panda-4.0), the gap between the deep, low-page-count site and the wide, high-page-count site widened significantly. SERP placement improved 10-45% on all keywords, and traffic rose 73%. This supports the speculation that Google is relaxing on-page factors to benefit smaller businesses, and thus smaller websites.