Hi all, I have just tried running one of the sitemap generators (a PHP one) but it only picks up a few of the pages. My site uses PHP and has a directory of cars from the database, with a page for each car, plus the other standard pages (contact us and so on). I take it making a sitemap for this kind of site is much harder than just running a sitemap generator. Thanks, John http://www.lease-hire.co.uk
Yeah, I know, but when there are a lot of pages and they change depending on what cars are in the database, it would involve quite a bit of looking after to keep it up to date. If it is not accurate and up to date, surely I am better off not having one at all?
Maybe you can have a look at phpSitemapNG. There you can just create a special input plugin that connects to your database and feeds that information into the system, which then creates the sitemap file(s), with or without a sitemap index file. If you have any question or problem regarding phpSitemapNG, drop me a private message.

Google doesn't penalize you when the sitemap's content isn't accurate. I always use the timestamp of the sitemap creation for all files, and Google doesn't seem to react to this. I would call sitemaps just a hint for Google, nothing more - just another way to submit a list of URLs. But you should make sure the URLs comply with the Google rules, otherwise you might point Google straight at the problematic pages, and that could cause trouble. Tobias
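P.S. In case it helps: when there are a lot of URLs, the output can be split into several sitemap files tied together by a sitemap index. Roughly, the index just lists the individual sitemap files, something like this (the file names and date below are made-up placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-cars.xml</loc>
        <lastmod>2006-01-15</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-pages.xml</loc>
      </sitemap>
    </sitemapindex>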
Hi all, I am new to the SEO field.... My client's site is hosted on a Linux server and he has previously used the phpSitemapNG generator.... Is it mandatory to use phpSitemapNG to create a sitemap for a Linux-hosted site, or can I use another tool like CoffeeCup or sitemapgen? I am not comfortable with PHP or phpSitemapNG, so it's hard for me to do the settings.... Can anybody guide me on how I can do that?
Hi James, thanks for giving phpSitemapNG a try. What exactly is the problem? Can you give more information about it - that would help to solve it. Did you manage to install it? What kind of website do you / does your client have? Best regards, Tobias
I made my own PHP script. It's good in that it crawls every single link, but it has a timeout problem...
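The workaround I'm planning to try is just lifting the execution time limit (a rough sketch - it assumes the crawl is kicked off from the command line or a cron job rather than a normal page request):

    <?php
    // Remove PHP's execution time limit for this one long-running crawl.
    // Only sensible when run from the command line or cron, not in a web request.
    set_time_limit(0);
    ignore_user_abort(true); // keep going even if the connection drops
    // ... crawl and write the sitemap as before ...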
Don't build a sitemap generator that crawls your site - it should be reading your database! I use sitemaps on all of my sites, and the worst one to maintain is the site that is hand-built and not using a CMS.
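For example, writing the sitemap straight out of the database might look something like this (a rough sketch only - the cars table, its id/updated_at columns and the URLs are invented, so adjust them to your own schema):

    <?php
    // Sketch: build sitemap.xml directly from the database - no crawling needed.
    $db  = new PDO('mysql:host=localhost;dbname=carsite', 'user', 'pass');
    $xml = new XMLWriter();
    $xml->openURI('sitemap.xml');
    $xml->startDocument('1.0', 'UTF-8');
    $xml->startElement('urlset');
    $xml->writeAttribute('xmlns', 'http://www.sitemaps.org/schemas/sitemap/0.9');
    foreach ($db->query('SELECT id, updated_at FROM cars') as $row) {
        $xml->startElement('url');
        $xml->writeElement('loc', 'http://www.example.com/car.php?id=' . (int) $row['id']);
        $xml->writeElement('lastmod', date('Y-m-d', strtotime($row['updated_at'])));
        $xml->endElement(); // </url>
    }
    $xml->endElement(); // </urlset>
    $xml->endDocument();
    $xml->flush();

Run something like that from cron whenever the stock changes and the sitemap stays in step with the database - no crawl, no timeout.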
Even if your website resides on a Linux server (and EVEN if it is partly database-driven), most sitemap generators (crawlers) should work just fine! You can use your favourite Windows/whatever tool for the job. Are you experiencing any problems using e.g. CoffeeCup Sitemapper, A1 Sitemap Generator, GSiteCrawler etc.? (For more, check: http://code.google.com/sm_thirdparty.html)
I have a large database of images, but they are not really named in any meaningful way. What would be the best way to get those indexed?
Everyone, try this automatic, free, deep-crawling sitemap generator: http://www.auditmypc.com/free-sitemap-generator.asp It's easy to use and fast. You can then use PHP and an XML parser to generate an HTML sitemap for your site's visitors.
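As a rough illustration of that last bit (it assumes a standard sitemap.xml in the usual sitemaps.org format sitting next to the script):

    <?php
    // Sketch: turn an existing sitemap.xml into a plain HTML sitemap page.
    $ns  = 'http://www.sitemaps.org/schemas/sitemap/0.9';
    $xml = simplexml_load_file('sitemap.xml');
    echo "<ul>\n";
    foreach ($xml->children($ns)->url as $url) {
        $loc = htmlspecialchars((string) $url->children($ns)->loc);
        echo '<li><a href="' . $loc . '">' . $loc . "</a></li>\n";
    }
    echo "</ul>\n";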
You could add the images to the sitemap (need to check the rules on that, though), but the sitemap itself doesn't handle naming or indexing criteria. To get them indexed meaningfully you're better off with an HTML sitemap where the image links are titled and alt'ed - and then adding that HTML sitemap to your XML sitemap.
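Purely as a sketch (the images table and its caption column are invented for the example - swap in whatever descriptive text you actually have), the HTML image sitemap could be built like this:

    <?php
    // Sketch: an HTML page linking every image with a descriptive title and alt text.
    $db = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');
    foreach ($db->query('SELECT filename, caption FROM images') as $row) {
        $href = '/images/' . rawurlencode($row['filename']);
        $text = htmlspecialchars($row['caption']);
        echo '<a href="' . $href . '" title="' . $text . '">'
           . '<img src="' . $href . '" alt="' . $text . '"></a>' . "\n";
    }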
I recently read that having &id= in a URL means that the page won't be crawled by Google. Is this true? If so, do I have to have a stab at using mod_rewrite on my pages?
Google doesn't mind id - it's sid or sessionid that they'll get picky about. I'd tend to avoid id, but having parameters is OK. For an example of a site doing well with them, check out http://www.searchenginejournal.com/?p=3066
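If you do decide to go the mod_rewrite route anyway, it's only a couple of lines of Apache config in .htaccess (a sketch - it assumes mod_rewrite is enabled and that the dynamic page is called car.php, so adjust the names to your own setup):

    RewriteEngine On
    # Serve /cars/123 from the existing dynamic page, keeping the query string out of public URLs
    RewriteRule ^cars/([0-9]+)$ /car.php?id=$1 [L]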
And even if you do have problems with SIDs etc. on your site, some *cough* sitemap generators can filter that stuff out.