The poll is for the thread's title, and the thread discussion is: how do you deal with getting directory pages out of supplementals? I personally added some content for categories that have no links in them, and in one week I dropped supplemental results from 90% to about 50%.
I'm having good success with tweaking the descriptions so they pass the Copyscape test. I have 3 directories that I've been doing this on recently. It's more work, but in the long run I know it will pay off. www.keenj.com www.foxcabin.com www.human-edited.com
Hello... Not really sure, but I would guess 10% or less. I'm not positive, though, as the last time I checked I didn't see many in www.sleekdirectory.com, so... thx malcolm
Really? How's that? The supplementals don't start until page 90-something in the Google results, and it only goes to page 100-something... Plus it's got more than 11,700 indexed pages. Also, if I may say in that site's defence, it has been dropping in and out of the SERPs, so maybe there is an error somewhere? http://www.google.com/search?q=site:www.sleekdirectory.com&hl=en&safe=off&start=860&sa=N Also, if I might add, if you look at the cache dates, most haven't even been there lately, and many changes have occurred since those updates shown, so... thx malcolm
It's very high, but I'm still waiting for the full effects of rewriting my URLs on 2 directories. I will do the third next week and see what kind of impact that makes.
Malcolm, compare the results of these two searches: http://www.google.co.uk/search?num=100&hl=en&q=site:www.sleekdirectory.com&btnG=Search&meta= http://www.google.co.uk/search?num=...ctory.com+*+-jdfgkljadlkjfg&btnG=Search&meta= I'm also seeing 11,200 out of 11,900 in supplemental.
I'm also seeing this in the second search but not the first. I have recently done SEO throughout the site, changing various things... wonder if it was something I did right or wrong. Will look deeper and figure out why and what's going on... thx malcolm
The most common way to check for supplementals is to search on site:http:// www. sitename.com ***-view (without the spaces!)
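For example, with the site being discussed above, the query would look something like this (treat it as a rough check rather than an exact count, since the numbers Google reports can jump around):

site:http://www.sleekdirectory.com ***-view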
I suspect it will help you most to look up Adam Lasnik's comments on crawl budget. I think this issue tends to affect directories more than most websites, because they tend to be large and are more prone than most to duplicate and almost-duplicate pages.
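If a lot of those near-duplicates come from URL variations (print views, sort orders, that sort of thing), one option is to keep them out of the crawl entirely with robots.txt, so the crawl budget gets spent on the real category pages instead. Just a sketch, and the paths here are made up, you'd swap in whatever your directory script actually uses:

# robots.txt - keep crawlers off the near-duplicate views
# (the /print/ and /sort/ paths are only examples)
User-agent: *
Disallow: /print/
Disallow: /sort/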
Interesting... I just checked Google Webmaster Tools and it says the crawl is fine. I also noticed a new tool there for URL Removals:

To remove content from the Google index, do one of the following:
* Ensure requests for the page return an HTTP status code of either 404 or 410. [?]
* Block the page using a robots.txt file. [?]
* Block the page using a meta noindex tag. [?]
Your content will then be removed from the index the next time Google crawls your site. If you need to expedite your content removal, make sure you have met one of the requirements listed above, and then select the New Removal Request button below to use this automated tool.

As I said, I've just done a good bit of SEO, and I can't see making pages any friendlier or, for that matter, adding more content, as it has more than most websites do... I would only assume this is a Google error of some sort, as about 2 weeks ago it showed 9,000+ errors in the webmaster tools they offer for sitemaps, and then I redid some of the URLs as well as added a NEW sitemap, and now it shows only 22 errors, so... But of course I will look into it further and see if it starts clearing up, as I have just recently added news feeds to help and ensure that doesn't happen.. and yes, they are relevant. thx malcolm
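In case anyone else is looking at the same screen: the meta noindex option it mentions is just a tag in the page's head, something like this (assuming you can edit the templates for the pages you want dropped; I haven't tried it on the directory yet):

<head>
  <title>Some category page</title>
  <!-- tells Google (and other crawlers) not to keep this page in the index -->
  <meta name="robots" content="noindex" />
</head>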
Thanks, it worked. Most of the supplemental pages are due to RSS feeds. Is there any way to avoid that?
According to FeedBurner, this is supported both by Yahoo! and Google:

<xhtml:meta xmlns:xhtml="http://www.w3.org/1999/xhtml" name="robots" content="noindex" />

I didn't test it though.
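If it helps to see it in context: as far as I can tell from FeedBurner's description, the element just sits inside the feed's channel. Something like this for an ordinary RSS 2.0 feed (example.com and the titles are placeholders, obviously):

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Category Feed</title>
    <link>http://www.example.com/some-category/</link>
    <description>Latest listings in this category</description>
    <!-- asks Google/Yahoo! not to index the feed URL itself -->
    <xhtml:meta xmlns:xhtml="http://www.w3.org/1999/xhtml"
                name="robots" content="noindex" />
    <item>
      <title>Example listing</title>
      <link>http://www.example.com/some-category/example-listing/</link>
      <description>One of the normal items in the feed</description>
    </item>
  </channel>
</rss>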
I noticed that on Malcolm's pages as well. It's not that the feed itself isn't supported, but the feed is probably taking data that is already displayed elsewhere on the directory. It's not new content, so it's possibly considered to be duplicate. But I think the main issue with feeds is likely to be their low PR, since they're just not as important as the category pages, and they're not likely to be linked as extensively.