Hello DP Forums and all Shawn Hogan fans. A site I am working on has 22,000 pages indexed by Google. When performing a site:www.weichert.com search, only 4 pages are returned. The rest are sent to supplemental results, and I can only view the indexed pages by clicking "repeat the search with the omitted results included". The mystery is that most of the pages are unique, static, original content, while pages that are not unique often still show up. Could there be a problem in the source code that is pushing my site into supplemental hell?
Don't panic just yet! Have you created a Google sitemap yet? You may want to try that first. How is your info coming up on other SEs? You may also want to create individual pages for all your listings, as search engines will not fill out forms for you!
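In case it helps, a Google sitemap is just an XML file listing the URLs you want crawled, uploaded to your site root and submitted through your Google Sitemaps account. A minimal sketch would look something like the below (the listing URL is a placeholder, and check Google's documentation for the exact schema version they currently accept):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <!-- one <url> entry per page you want crawled -->
    <url>
      <loc>http://www.weichert.com/</loc>
      <changefreq>daily</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <!-- placeholder listing URL; repeat for each real listing page -->
      <loc>http://www.weichert.com/listings/example-listing.html</loc>
      <changefreq>weekly</changefreq>
    </url>
  </urlset>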
Check out "Ideas on Washing Out Supplemental Results" by RustyBrick at http://www.seroundtable.com/archives/002722.html, which discusses how one might be able to remove some of the supplemental results returned for a site: command.
You need to make the pages more distinct. Your titles and meta data seem to be the same across pages. Try making each page as unique as possible.
I noticed that all the pages erroneously have </META> tags, a closing tag that does not exist in HTML. Could this be tripping up Googlebot?
Definitely get rid of that tag (</meta>), but also heed ServerUnion's advice about unique page titles and meta data.
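To illustrate both points, here is a rough sketch of what the head of one listing page could look like: no </META> closing tag (META is a void element in HTML), and a title and description unique to that page. The listing details below are made up for the example:

  <head>
    <!-- title: unique per page, not the same sitewide boilerplate -->
    <title>3 Bedroom Colonial for Sale in Morristown, NJ - Weichert Listing #12345</title>
    <!-- META is a void element: no closing </META> tag -->
    <meta name="description" content="3 bedroom, 2 bath colonial in Morristown, NJ. Photos, price, and open house dates for listing #12345.">
    <meta name="keywords" content="Morristown NJ real estate, colonial, 3 bedroom">
  </head>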
Digging through the omitted results, there are so many unique pages. Why are they being omitted? Are they not unique enough?