I'm just getting into Web 2.0, and all I hear about is tier linking. Here's what I know so far:

Money Site
Tier 1 - high-authority Web 2.0s, article directories, social media, etc. (pointing to the money site)
Tier 2 - wikis, blog comments, bookmarks (pointing to Tier 1)
Tier 3 - spam/trash: forum profiles, pings, etc. (pointing to Tier 2)

If I'm missing something, please add to the list so I can understand better. And, most importantly, how do you track all this confusing link building? Thanks
Hey there,

In 2014, I'd recommend the following. It's what works for me and also for SEO friends of mine who typically like to keep this stuff a secret.

A) Money Site <- Web 2.0 <- Web 2.0 <- lower-quality spam (article directories, comments, forums)
B) Social signals to the first two levels (Facebook, tweets, bookmarks)

The reason Web 2.0s are used for T1 and T2 is their high domain authority. The DA acts as a protective buffer: you can typically spam high-DA sites (YouTube is a great example) without incurring penalties. Keeping two levels of 2.0s between your spam and your money site is a good safety measure in 2014.

Here's a list of Web 2.0s I recommend that have high stick rates and good DA: Beeplog, EduBlogs, Jigsy, Jimdo, Kiwibox, LiveJournal, Postbit, Pumpbuddy, Skyrock, Soup.io, WebGarden, WordPress, Tumblr.

As for keeping track of it all: if I were doing everything manually (which I don't recommend), I'd use Excel. Instead, I'd invest in a tool like FCS Networker to create, manage, and post to my 2.0s.

Hope this helps.
Diggity
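If you'd rather not pay for a tool and Excel feels clumsy, a minimal sketch of tracking tiers yourself might look like the following. All URLs, filenames, and field names here are illustrative assumptions, not part of any real campaign; the idea is just one row per link, recording which tier it sits in and what it points at.

```python
import csv
import os
import tempfile
from collections import defaultdict

# Hypothetical example data: each backlink records its own URL,
# its tier, and the target it points at (one tier closer to the money site).
links = [
    {"url": "http://example-web20-a.tumblr.com", "tier": 1, "target": "http://moneysite.example.com"},
    {"url": "http://example-web20-b.livejournal.com", "tier": 2, "target": "http://example-web20-a.tumblr.com"},
    {"url": "http://example-forum-profile.example.net", "tier": 3, "target": "http://example-web20-b.livejournal.com"},
]

def save_links(path, rows):
    """Write the link records to CSV so they can still be opened in Excel."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["url", "tier", "target"])
        writer.writeheader()
        writer.writerows(rows)

def links_by_tier(rows):
    """Group link URLs by tier for a quick overview of the pyramid."""
    tiers = defaultdict(list)
    for row in rows:
        tiers[row["tier"]].append(row["url"])
    return dict(tiers)

save_links(os.path.join(tempfile.gettempdir(), "links.csv"), links)
print(links_by_tier(links))
```

The CSV keeps it interoperable with a spreadsheet, while the grouping function gives a fast sanity check that every tier actually points one level up.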
Would it be better to go a bit further?

Tier 1 - Web 2.0 > Money Site (send high-quality links to Tier 1)
Tier 2 - Web 2.0 > Tier 1 (send medium-quality links to Tier 2)
Tier 3 - Web 2.0 > Tier 2 (send low-quality/spam links to Tier 3)

The more I read up on all this stuff, the more confusing it gets. I've run across posts from back in 2012 saying that Web 2.0s don't really work and that Google has banned some Web 2.0 sites.
Unfortunately I'm a tad swamped, but I'll chime back in with the "why" shortly. You can safely stop reading that stuff; it wasn't accurate in 2012, and it's simply not how you do things. If you can label something a "link scheme," that's exactly what it is. First it was reciprocal links, then triangulated links, then this and that, then link wheels, my personal favorite, lol. You cannot rank things by hurling links at the problem; it simply demonstrates a complete lack of understanding of how to rank something. http://moz.com/community/q/how-to-create-effective-backlinks-and-promote-very-small-sites

My current SEO client base I've had to pull out of the gutter, all because they either hired someone from here or oDesk to do that sort of thing, or they've been reading the same plop. It would be nice to start something clean out of the gate and avoid these issues, but apparently I'm going to be polishing other folks' turds for weeks to come. Fortunately I sort of have the algorithm figured out for the time being, so it's not too hectic. It pains me when folks hit me up and I discover that this sort of thing is what's been done to their sites.

I've said this no less than 100 times around here: I essentially make sites build their own backlinks (not kidding, either). I can't remember the last time I actually "built a link," though I'm certainly going to have to do a bit of that shortly. I'll try to expand on that if I can free up some time this evening; I'm making unusual headway for a Monday. I'd start by camping out on Moz's Whiteboard Friday page: go back about two months and start watching.

Nigel
Interesting read, fair enough I guess, but how does Google know you have quality content? Sure, they have their high-tech algorithms, variables, constants, etc., but how does Google know what's good content and what's bad? So if I pull news syndication into my blog, that's considered bad, low-quality content? Something doesn't jibe. So does my site have poor-quality content? http://yourgamingstore.com
First, news syndication is not unique content, and Google doesn't like duplicate content. Next, quality content is judged by signals like keyword density, readability, bounce rate and total time spent on the site, traffic and where it comes from, how your site and its content perform on social media, and the quality of the links pointing at your website. Other factors, such as page speed, are also considered.

Instead of spending time tiering links and running backlink campaigns, spend it on social media marketing, on writing high-quality content, and on connecting with your visitors and customers and forging relationships. Google is constantly updating its algorithms to identify paid and unnatural links. Besides, if you have a ton of backlinks but poor bounce rates, a poor social media presence, etc., Google will be able to tell that something doesn't add up. And if your site isn't high quality, it doesn't matter how well you rank on Google. Customers are key.
If your site isn't doing well, here's why:

First, your content is not unique; you've simply scraped it from media outlets. As far as Google is concerned, you have NO content on your site.

Second, the layout is a mess. What is going on with all the pictures, headshots, and strange links here: http://yourgamingstore.com/blog/com...-secure-gears-of-war-for-the-xbox-one-forbes/

You have Amazon affiliate banners on the side of an e-commerce site; there's no need for these since you already sell products on the site. Your store page is a mess. Why is it built this way? Why are categories listed multiple times? http://yourgamingstore.com/product/

You have a refund policy, but it looks like you're running an Amazon affiliate store? Your Facebook and Twitter links in the footer lead nowhere; they're just share links, and you have no social media presence. Your website is also slow; run it through Google PageSpeed.

SEO is the least of your concerns, to be honest.
In that case, only time will tell. To be honest with you, nothing really is unique; once it's posted, it's not unique anymore. I could go on and on about how everyone thinks they know what Google likes, dislikes, loves, and hates. I realize syndication is not unique, but it is unique to my site, unless Google expects the whole wide world to go straight to the original source. I hate to burst bubbles, and God knows I'm no SEO guru, but most of the internet is syndication/duplicate based, so none of those sites ever rank at all? Why are they online? Just because? The last story about unique content I read was about someone who used duplicate content, then switched to unique content thinking it would help, and little changed: same traffic, same stats, same rank.
Short version: because it's useful content and others talk about it. What is inherently useful depends entirely on the demographic. Philly IM is largely on the right track here, though I would posit that keyword density is a pretty useless metric; folks shouldn't be thinking about it, or their writing starts reading funny.

Nigel
Thank god someone gave me some feedback; I've been looking for it all day. Sorry to say, people aren't perfect, unless Google wants to be exactly like them. Oh wait, they do...
I would agree. To clarify my use of 'keyword density': if yours is too high, it can be detrimental, and I suspect the 'ideal' density is much lower than most believe. All content should be naturally written and designed for human readers, not for search engines. No problem; good luck with your site.
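For anyone who wants to see what keyword density actually measures, here is a minimal sketch: the share of words in a text that match a given keyword. The function name and sample sentence are my own illustrations, not anything from this thread; the point is just that the number is simple to compute and easy to over-optimize.

```python
import re

def keyword_density(text, keyword):
    """Return the fraction of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A deliberately keyword-stuffed example: 3 of the 12 words are "gaming".
sample = "Gaming news and gaming reviews: the best gaming deals for every gamer."
print(round(keyword_density(sample, "gaming"), 3))  # → 0.25
```

A 25% density like the sample's is exactly the kind of stuffing that reads funny to a human, which is the point being made above: write naturally and the number takes care of itself.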
The thing about duplicate content is that, even duplicated, it probably still doesn't reach the masses enough for them to talk about it. What may be duplicate to one person may be something new to another who reaches the site via indexed pages. I run across stuff all the time that is years old, and it's new to me simply because I've never seen it before. If you read it years ago, it's duplicate content; if I'm just finding it, it's new to me. It really depends on how you look at it, I guess.
With W3 Total Cache on, the site was given a page-load score of 80% (loads in under 5 seconds); it's not a simple one or two calls, there are multiple calls and processes taking place. With W3 Total Cache off, the site scored 73%. Now consider this: HostGator...
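Those percentages come from a grading tool, but you can also measure raw response time directly, which is a more honest before/after comparison for a caching plugin. Here is a rough sketch; it spins up a throwaway local server purely so the example is self-contained, and in practice you would point `measure_load` at your own site's URL instead.

```python
import http.server
import threading
import time
import urllib.request

def measure_load(url, attempts=3):
    """Time several full GET requests and return the average seconds taken."""
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # read the whole body so the timing covers the full transfer
        total += time.perf_counter() - start
    return total / attempts

# Throwaway local server just for the demo; replace with your real site URL.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
elapsed = measure_load("http://127.0.0.1:%d/" % server.server_port)
print("average load: %.3f s" % elapsed)
server.shutdown()
```

Run it once with the cache plugin enabled and once disabled; comparing averaged seconds avoids reading too much into a single-digit percentage difference from a grader.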