Ah, sorry... I was thinking of phpNuke and thought you could adapt them... But you should be able to do that anyway: download the mods and look at the code.
Yeah, there is a strange error I get on this Apache server. On my other sites mod_rewrite works great and doesn't require any sitemap, or I can create my own sitemap manually, but for a site with 12,000 articles plus jokes and other categories it's just a pain to sit down and do it by hand. I tried sanke seo; you can see that www.bollyheat.com/sitemap.php works fine, but the links don't! When someone clicks on one it says "mod not found". Maybe I have to take another look at my .htaccess file! Thanks!
I did a sitemap for a test project, and out of 4000 pages Google seems to have read 500, although none of them are in the index yet.
Working well: after I submitted my sitemaps [manually created], Google has started indexing my pages again! One thing I learnt for dynamic pages: you need not list every page of the site in the sitemap; a link to the pages that list them will do, along the lines of the example below. I have only 8 entries in my sitemap, and in one day 3,906 pages were indexed!
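To illustrate the point, a sitemap that only lists the index/category pages can be as small as this, and Google follows the links on those pages to reach the individual articles. The URLs here are made up, and the namespace shown is the standard sitemap schema; check which schema version Google currently expects:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- only listing pages; Google follows their links to the individual articles -->
  <url>
    <loc>http://www.example.com/articles/index.php</loc>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/jokes/index.php</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>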
I was able to successfully include every page from my Invision Power Board. I wrote my own PHP page to generate the XML file so I could include static content as well. I then set up a curl cron job to call the page (with a querystring input required, so that simply accessing the page won't generate the file). The results have been very promising very early on.
Would you mind explaining how you did that? The querystring thing, because when I request mine it almost freezes my sitemap.
I'm not quite sure what you guys are talking about, but you might want to put some delays into your program. PHP has a seconds delay function (sleep) and a microseconds one (usleep), so check those out.
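For what it's worth, a generic illustration of pacing a loop with those two functions (not tied to any particular script above; the page list is just a placeholder):

<?php
// sleep() pauses in whole seconds, usleep() in microseconds.
// $pages is a placeholder list for whatever your script loops over.
$pages = array('page1.php', 'page2.php', 'page3.php');

foreach ($pages as $page) {
    // ... fetch or process $page here ...
    usleep(250000); // wait a quarter of a second between pages
}

sleep(2); // wait two full seconds before any final step
?>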
By the way, of the 400 pages in my sitemap, 386 are now cached, and I suspect the stragglers will soon be cached as well. That's up from about 30 pages cached a couple of weeks ago.
Hi Minstrel, I have been considering the worth of the Google sitemap.

With no Google sitemap in use, I just added heaps of new pages to a forum, and very soon after Google found the main forum pages, but it has yet to find all the threads. Google often spiders based on the Google PR of the site: the higher the PR, the more often, earlier and deeper it seems to spider. So if you are a low-PR site, you can wait quite a few weeks for new content to be found by Google. And even if you are a higher-PR site, you can still wait quite a number of days for Google to find all the new pages that have been added.

So Google sitemaps can get new pages found faster, and how fast those pages are found and spidered is less dependent on the Google PR of your site (IMO).

Where Google ranks a page is a lot about Google PR and the link text pointing at that page. So once Google has found all those new pages, if those pages carry a lot of link text/anchor text for certain important pages on your site, then those important pages will rank higher earlier. You need to make sure that you have some good links from existing high-PR pages to those new pages, and good HTML site maps, so Google will calculate in good PR for the pages you have created.

It is one thing for a page to be cached by Google. It is quite something else for it to then be ranked high by Google.

So how about it, Minstrel: is that a good reason why Google sitemaps are a good thing?
I was wondering if it is okay to submit more than one sitemap per site? For example, one for the forum and another for the directory.
Sure. I have my PHP page, let's call it a.php. The code in a.php generates b.xml, which is my Google XML sitemap. I have set up a cron job to call a.php once every day so a new b.xml is generated every day. In case anyone accidentally found a.php, I programmed in an if statement that requires a querystring variable in the URL, so if you just try to view a.php, nothing happens. If you access a.php?action=generate, the code is executed and a new XML file is generated.
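For anyone who wants the general shape of that, here is a minimal sketch of what a.php could look like. The function get_forum_urls() is a placeholder for whatever pulls the URLs out of your own database, and the file names are just the ones used in the post above:

<?php
// a.php - regenerates the sitemap (b.xml), but only when called with ?action=generate
if (!isset($_GET['action']) || $_GET['action'] != 'generate') {
    exit; // viewed without the querystring: do nothing
}

$urls = get_forum_urls(); // placeholder: return an array of full page URLs

$xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($urls as $url) {
    $xml .= "  <url>\n";
    $xml .= "    <loc>" . htmlspecialchars($url) . "</loc>\n";
    $xml .= "    <changefreq>daily</changefreq>\n";
    $xml .= "  </url>\n";
}
$xml .= "</urlset>\n";

file_put_contents('b.xml', $xml); // write the finished sitemap next to a.php
echo "sitemap written";
?>

The cron side is then just a daily entry that fetches http://www.example.com/a.php?action=generate with curl or wget.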
Using Google sitemaps has been very successful for me, especially on large sites. One site went from 286 pages indexed to 130,000 indexed, another from 80,000 to 540,000. About 4-5 days after submitting the sitemaps, Googlebot started visiting, adding about 30,000 entries per day to the index.
Following on from the excellent work of Kalius, I have put together a vBulletin Google sitemap that has pagination for the forums, threads, archive and archive thread version. All items are accessible via the config variables in the file. Yet to be added is support for multiple sitemaps for large vB sites. Google sitemap for vBulletin. The copy of it on my personal site: vBulletin Google sitemap
If I read the posts in this forum correctly, those of us hosted by GoDaddy are left out of sitemaps. Somebody tell me it ain't so and how I get my host to read a $ command, and run python. Thanks. My direct email is
There are many ways to skin a cat. You do not have to use the Google version. You can use PHP and build the sitemap by interrogating a database for URLs (as has been done with the vBulletin Google sitemaps), along the lines of the sketch below. Or you can run a normal sitemap spider over your site and manually use PHP, or even Excel, to create the XML for the sitemap. There are always lots of different ways.
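As a rough sketch of the database route: the table/column names (thread, threadid), the showthread.php?t= URL format, and the connection details are all assumptions here, so adjust them for your own install:

<?php
// Print a sitemap built straight from a forum database.
$db = mysql_connect('localhost', 'dbuser', 'dbpass');
mysql_select_db('forum', $db);

header('Content-type: text/xml');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

$result = mysql_query('SELECT threadid FROM thread', $db);
while ($row = mysql_fetch_assoc($result)) {
    echo "  <url><loc>http://www.example.com/showthread.php?t="
       . (int) $row['threadid'] . "</loc></url>\n";
}

echo "</urlset>\n";
?>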
Submitted my first sitemap to Google for our site that has dynamically created pages. I am planning to resubmit it every time I add more pages or make content changes. Not quite sure what difference it will make. Anybody else with experience?