A common suggestion I read for getting out of Google's supplemental index is to get links to those particular pages. Whether that works or not I really don't know, but at least that technique seems doable if you're talking maybe 10 or 20 pages in the supplemental index. However, if you're talking about releasing hundreds or thousands of pages from the supplemental index, I don't know how you'd go about getting external links to each of those pages. Has anyone ever tried creating a website "B" that is nothing more than an external sitemap to another site "A"... "A" being a site whose pages are mostly or all supplemental? I was wondering if this would provide the external links needed to get site "A"'s pages out of supplemental. I would think that placing this external sitemap site "B" in your DP signature might be enough to at least get it crawled. Has this been tried? Waste of time? Thanks.
Waste of time. It wouldn't be the natural linking that Google is looking for. Look at links as votes. The internet is split up into pages or documents, not websites. Anything that wants to rank, every page, every document, needs votes. And not many votes from the same guy, but many votes from many different guys and girls. So... all your pages need links from, ideally, a lot of different websites.
I have found that unique content and meta data on the page got me out: HTML title, meta keywords, meta description, alts on images, and text on the page. I was able to do this on a few sites, but if there are lots of pictures involved, I found that appending a number to the pic didn't seem to make it "unique" enough, so doing this programmatically is a little harder. Have you submitted any XML sitemaps? I would be interested to see if your external sitemap idea works. Keep in touch.
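Generating an XML sitemap programmatically is straightforward. A minimal sketch in Python, assuming a hypothetical list of page URLs (the domain and helper name here are illustrative, not from the thread):

```python
# Minimal sketch: build a sitemap.xml string for a list of page URLs.
# The example URLs are hypothetical placeholders.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return an XML sitemap string in the sitemaps.org format."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

if __name__ == "__main__":
    pages = ["http://www.example.com/", "http://www.example.com/page-1.html"]
    print(build_sitemap(pages))
```

The output file would then be saved as sitemap.xml and submitted through Google's webmaster tools.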
Well, my 103 supplemental pages are linked using contextual internal links and external links, and all pages are linked to each other, but nothing seems to work: 5 indexed, 103 not, and some not at all, which I understand since they're new and linked to supplemental and indexed pages. Ah, confusing. I need tips, so your idea may work. As for whether it's been done before, I don't know.
You will need unique head tags and content on all your pages if you wish to retrieve them from supplementals, or you can try washing.

Washing supplemental pages:
1. Remove all content from the supplemental pages.
2. Wait till Google caches the blank supplemental pages.
3. Add content back to the pages, making sure to use unique meta data. (If you are in supps for dupe content, adding it back will make a vicious cycle for you... get original... think outside the box.)
4. Build a link to the supplemental page(s).
5. Wait till the next cache and see where you are then.

May require rinse and repeat... till you get the content mix right.
Check out the Google Webmaster Group. There are some very interesting experiments going on into internal PageRank distribution and getting out of the supps. Some threads are well worth a read.
Do you have unique meta tags, descriptions, and titles on the pages? Duplicates there could also put them in supplemental.
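One way to keep head tags unique is to build each page's title and meta description from that page's own content. A minimal sketch, assuming a hypothetical site name and page data (none of these names come from the thread):

```python
# Sketch: derive a unique <title> and meta description for each page from
# its own topic and summary text, so no two pages share identical head tags.

def head_tags(topic, summary, max_desc=155):
    """Build title and meta description tags for one page."""
    title = f"{topic} - Example Site"   # "Example Site" is a hypothetical site name
    desc = summary[:max_desc].rstrip()  # keep within a typical snippet length
    return (
        f"<title>{title}</title>\n"
        f'<meta name="description" content="{desc}">'
    )

if __name__ == "__main__":
    print(head_tags("Blue Widgets",
                    "Everything about blue widgets, from sizes to pricing."))
```

The same idea extends to image alt attributes: describe the picture itself rather than appending a serial number, which (as noted above) doesn't seem to count as unique.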
Now 41 of my 103 indexed pages have left the supplemental index and traffic has tripled. In short, I:
- removed links to my own sites
- concentrated on one main topic throughout
- used unique titles, meta descriptions, and descriptive alt tags
- added more backlinks related to my topic to each page
- limited my sitewide links
- added unique content, and lots of it

Errm, that's it, I think. Well, that's what I assume helped me.
Try changing the content on the site, including the metas, and get some trusted, quality PR links to the particular supplemental page.