My site has thousands of pages, but only a few hundred are indexed in the SERPs. The "deeper" pages are used less, but they contain good content as well. It looks like the Google spider never gets to those pages. Is there a way to improve this?
An XML sitemap pinged to G will get them indexed (probably supplemental, though, if they weren't getting indexed before). More link popularity will help them rank (or get indexed if you're not using a G sitemap). Deep links will help too.
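For reference, pinging G is just a matter of requesting a URL like the one below once your sitemap is online (this assumes the standard Google sitemap ping address; the example.com part is a placeholder for your own URL-encoded sitemap location):

    http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml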
For Google in particular, you should create an XML sitemap and submit it to Google. Otherwise you can create an HTML sitemap and include a link to every single page from it. This will help with other search engines as well as Google.
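As a rough sketch, a minimal XML sitemap following the sitemaps.org protocol looks like this (the URLs and dates are placeholders for your own pages):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2007-06-01</lastmod>
        <changefreq>daily</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/deep-page.php</loc>
        <lastmod>2007-05-20</lastmod>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>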
The sitemap suggestions above are the first move (I guess - I don't use sitemaps myself), but the obvious next step is to start getting deep links into your content.
An XML sitemap and external inbound links are both solid ideas. Also consider:
- a "related pages/content" tool. For any given page, it suggests related articles based on keywords and content. From an SEO perspective, this creates more page-to-page links ... very helpful.
- featuring deep pages occasionally on your home page for 1-3 days. This raises their profile with Google et al. and will get them indexed.
- a tagging tool that lets you create ad hoc categories alongside your site's formal taxonomy. Again, this helps create page-to-page links.
Separately, don't fall into the trap of letting all your "deep" pages share identical page titles and no meta tags. They may be obscure, but each one still needs to be optimized to the best of your ability.
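To illustrate that last point, each deep page should carry its own title and description rather than a site-wide default - something along these lines (the wording is just an example):

    <head>
      <title>Blue Widget Repair Guide - Example Site</title>
      <meta name="description" content="Step-by-step instructions for repairing blue widgets, with photos and a parts list.">
      <meta name="keywords" content="blue widgets, widget repair, repair guide">
    </head>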
I have dynamic pages (PHP) and new pages appear every day. Is it possible to make an XML sitemap that keeps track of this and updates itself when new pages show up or when an older page changes?
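It is - since the pages are already database-driven, you can generate the sitemap on the fly instead of maintaining a static file. A minimal sketch, assuming a MySQL table called pages with url and updated_at columns (the connection details, table, and column names are made up for illustration - adjust them to your own schema):

    <?php
    // sitemap.php - rebuilds the sitemap from the pages table on every request,
    // so newly added or updated pages are picked up automatically.
    header('Content-Type: application/xml; charset=utf-8');

    // Hypothetical connection details - replace with your own.
    $db = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'password');
    $rows = $db->query('SELECT url, updated_at FROM pages ORDER BY updated_at DESC');

    echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

    foreach ($rows as $row) {
        echo "  <url>\n";
        echo '    <loc>http://www.example.com' . htmlspecialchars($row['url']) . "</loc>\n";
        echo '    <lastmod>' . date('Y-m-d', strtotime($row['updated_at'])) . "</lastmod>\n";
        echo "  </url>\n";
    }

    echo '</urlset>' . "\n";
    ?>

Point Google Webmaster Tools (or the ping URL mentioned earlier) at sitemap.php and it will always reflect the current state of the site. If the site is very large, you would probably want to cache the output or write it to a static file on a cron schedule instead of rebuilding it on every request.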
1. Add a Google sitemap.
2. Link to every page from somewhere else on your site (make it like a web of links, get it....web! hehe).