I built BungeeBones.com to be search engine friendly. It is a web directory with over 1600 categories, and each category page has its own unique title and description, so the search engines index each category as if it were its own individual page. I believe it would be a waste of time to hand-write a sitemap entry for each one of those pages. Is there any point to making a sitemap for something like this? If so, is there any software that would accomplish it? Or should I just code my own and make a page with 1600 links on it?
I'm using a little PHP script that runs through the database and creates an XML sitemap at regular intervals.
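In case it helps, here is a minimal sketch of what such a script might look like. It assumes a MySQL table named categories with id and updated_at columns, and category URLs of the form https://example.com/directory.php?cat=<id>; those names are placeholders, so adjust them to your own schema and URL scheme. Run it from cron and point Google at the resulting sitemap.xml.

    <?php
    // Sketch: regenerate sitemap.xml from the category table.
    // Assumed schema: categories(id, updated_at). Adjust to your database.

    $pdo = new PDO('mysql:host=localhost;dbname=directory', 'user', 'password');

    $xml = new XMLWriter();
    $xml->openURI('sitemap.xml');          // write straight to the sitemap file
    $xml->startDocument('1.0', 'UTF-8');
    $xml->startElement('urlset');
    $xml->writeAttribute('xmlns', 'http://www.sitemaps.org/schemas/sitemap/0.9');

    // One <url> entry per category row
    $stmt = $pdo->query('SELECT id, updated_at FROM categories');
    foreach ($stmt as $row) {
        $xml->startElement('url');
        $xml->writeElement('loc', 'https://example.com/directory.php?cat=' . $row['id']);
        $xml->writeElement('lastmod', date('Y-m-d', strtotime($row['updated_at'])));
        $xml->writeElement('changefreq', 'weekly');
        $xml->endElement();                // </url>
    }

    $xml->endElement();                    // </urlset>
    $xml->endDocument();
    $xml->flush();

With around 1600 categories you are well under the sitemap protocol's limit of 50,000 URLs per file, so a single file is fine.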
Thanks, tricky noses, I'll check it out. hape, I've studied XML and never seem to get anywhere with it. I was thinking it would be a good way to cache some of the directory pages instead of hitting the database all the time.