Can anyone tell me the real reason why my site's www.mysite.com/products/ is in Google's supplemental results, while the proper index page, www.mysite.com/products/index.html (FYI, this is an example URL), is not in the SI? What would be the reason, and what corrections need to be made to get out of the SI?
Supplemental results are pages that have temporarily lost quality in Google's eyes. The main criterion for supplemental results is PageRank. You can read more here: http://www.seo4fun.com/notes/supplementals.html
Set up a 301 (permanent) redirect from one page to the other. For both to be in the index, it means you have links pointing to both of them. The redirect will consolidate the links and PR into whichever URL you prefer. If you're using an Apache server, you can do this pretty easily with mod_rewrite.
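For example, a minimal .htaccess sketch of that mod_rewrite approach, assuming Apache with mod_rewrite enabled and using the thread's example paths (the extra condition stops DirectoryIndex from looping when Apache serves /products/ via index.html internally):

```apache
# Hypothetical sketch: 301-redirect /products/index.html to /products/
# so search engines only ever see one canonical URL.
RewriteEngine On
# Only match when the browser actually requested index.html directly,
# not when Apache maps /products/ to it internally via DirectoryIndex.
RewriteCond %{THE_REQUEST} \s/products/index\.html[\s?]
RewriteRule ^products/index\.html$ /products/ [R=301,L]
```

The same idea extends to any directory if you capture the path with a pattern instead of hard-coding /products/.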
A page goes into the supplemental results when Googlebot sees two similar pages. In your case, www.mysite.com/products/index.html is the same as www.mysite.com/products/, which is why one of the pages is in the supplemental results. To avoid this, all you need to do is change every link that points to www.mysite.com/products/index.html so it points to www.mysite.com/products/ instead.
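To illustrate that linking fix with a hypothetical nav snippet (not taken from anyone's actual site), you'd pick the directory form of the URL and use it everywhere:

```html
<!-- Before: the file-name form, which creates a second URL
     for the same page -->
<a href="http://www.mysite.com/products/index.html">Products</a>

<!-- After: always link the directory form, so spiders only
     ever discover one version -->
<a href="http://www.mysite.com/products/">Products</a>
```

The point is consistency: every internal link, sitemap entry, and external link you control should use the same one of the two URLs.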
Well, just yesterday I noticed that a lot of my pages went into the supplemental results. I haven't done anything blackhat: no duplicates, etc. So I think Google is having issues with the site: command again? Have to find out.
One thing I noticed on my shop site was that even though the content was mostly different, I had 20 pages in the main index and 90 or so in the supplementals. Luckily the supplemental pages contained very niche products, so even though they were supplemental they still showed in the first few Google pages. TABLES! Because so much of the site used old table-based layout, the coding threw it into the supps. Oh well, live and learn.
I hear this all the time in alt.internet.search-engines: supplemental results are due to duplicate/similar content on your pages. Just make sure they're mostly different and, if possible, keep updating them on a regular basis. It's as easy as that. And supplemental doesn't mean you're not indexed or that the pages are of no value; it just means they're similar to other pages on your site. That's all, nothing more.
Absolute nonsense. It has nothing to do with whether or not the page uses tables. See "Google Hell?" by Matt Cutts, Tue, May 1 2007.
Haha... what crap... TABLES... sheesh. Nonsense, dude. My sites are all tables, some nested 4 deep, and I don't have one page in the supps.
Crap? Not at all, but I didn't explain it correctly. The table code is on each HTML page and not in an external CSS file. This throws the content-to-code ratio way out. Imagine what the bot sees on each crawl: 100 or so pages that are, say, 85% the same, with the content only 15% different. I know the styling should be in an external CSS file, but the site was built with a premade package and I didn't know any different at the time. The next step is moving the domain away from these people. It's nice how people want to jump on the bandwagon and start flaming people around here.
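For what it's worth, the fix being described is just moving the repeated presentational markup out of every page into one shared stylesheet. A hypothetical before/after (the file name site.css and class name are made up for illustration):

```html
<!-- Before: layout attributes repeated in every page's HTML,
     inflating the identical boilerplate each crawl sees -->
<table width="760" cellpadding="4" bgcolor="#ffffff">

<!-- After: each page carries only a class name; the styling
     lives once in an external stylesheet -->
<link rel="stylesheet" href="site.css">
<table class="layout">
```

Whether this actually affects supplemental status is disputed later in this thread; the only certain gains are smaller pages and easier maintenance.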
What the heck are you talking about? Which site is this? The "home page" in your profile is a WordPress blog...
Not that site: www.dws-sat.co.uk. Just take a quick look at its source in Notepad or something. When the site was put up, every page had all the code in it. So in my eyes it isn't going to help, really. That was my reasoning, anyway.
The code isn't that big, external CSS isn't going to make a discernible difference, and using tables isn't the problem. However, the links in your top navigation menu are problematic (I realize you have text links in a vertical menu as well, but you really shouldn't need both). If you want to see what spiders see, try one of these:
http://gritechnologies.com/tools/spider.go
http://gritechnologies.com/tools/diagnostic.go?www.dws-sat.co.uk
http://www.stargeek.com/crawler_sim.php
http://www.webscale.com/cgi-bin/timer.dll?SessionID=19660744&View=Overview&Mode=Engine
You'll find a lot of additional information there on what spiders are seeing on your site.
Thanks, Minstrel, great tools. I'll go back and rethink the content descriptions etc., and give them more in-depth info. Recently more stuff has been coming out of the supps. Thanks again.
I have at least a few pages in the supplemental index. I haven't done anything black hat, but I have a page devoted to each photo, with some basic info about it (shutter speed, aperture, location, etc.) and some comments. In some cases I've been a bit lazy with the comments, and the bot really can't tell two photos apart, so a few of them have fallen by the wayside. Unfortunately, in the case of Iron Horse State Park, one page has a low PR and does okay, while another page, with a much better photo, is in the supplementals. I need to go back and write much better comments describing the place, the technique involved in shooting the photo, and things like that, to make the page unique enough, and then hope that when Googlebot finally returns, it's more to their liking.
Thanks for all your views. Furthermore, I hope you're all aware that there will no longer be any SI-labeled listings in Google's SERPs. The SI label is now gone and will not affect rankings on the SERPs.