I am trying to create a sitemap for www.ultimatetickethub.com, but it is not counting all of my pages. It only includes about 130 pages, and my site so far has over 300 uploaded. What is wrong?
I was using a tool. What is happening is that it indexes a page like www.ultimatetickethub.com/concerts/dave_matthews_band_tickets.php but not this one: www.ultimatetickethub.com/concerts/dave_matthews_band_tampa_tickets.php. The second of the two is linked from the main Dave Matthews Band page. Won't Google's bots end up seeing this page anyway when they crawl the main DMB page?
One thing I quickly noticed is that you are mixing www and non-www links. Even though a tool like A1SG can be configured to treat both as the same "root" (which seems to solve the problem when scanning; I got 480 pages), it is better to fix the underlying problem!
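If the site runs on Apache with mod_rewrite enabled (an assumption; check with your host), one common way to fix the www/non-www mixing is a 301 redirect in .htaccess so every request ends up on the www hostname. A sketch:

```apache
# Redirect non-www requests to the www hostname (assumes Apache + mod_rewrite).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^ultimatetickethub\.com$ [NC]
RewriteRule ^(.*)$ http://www.ultimatetickethub.com/$1 [R=301,L]
```

With that in place, crawlers and sitemap tools only ever see one version of each URL, so the duplicate-host problem goes away at the source.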
Are you using a free generator, or are you just weaving everything together by hand? I would recommend the Google XML sitemap creator. Just type "google sitemap" into Google search and there is one big link with a sublink. Click on that and you've got yourself a good sitemap in 10 minutes.
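If you'd rather script it than use a generator, the sitemap format itself is simple enough to write by hand. Here is a minimal sketch: it takes a list of page URLs (the two example pages are from this thread; in practice you would feed it your full crawled list) and emits XML following the sitemaps.org protocol.

```python
# Minimal sitemap.xml builder following the sitemaps.org 0.9 protocol.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml string for the given list of absolute URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Example pages from the thread; replace with your real URL list.
pages = [
    "http://www.ultimatetickethub.com/concerts/dave_matthews_band_tickets.php",
    "http://www.ultimatetickethub.com/concerts/dave_matthews_band_tampa_tickets.php",
]
print(build_sitemap(pages))
```

The hard part, as you've found, is not writing the XML but getting a complete URL list in the first place, which is where a proper crawler helps.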
Download Xenu, which is free software easily available online. Xenu crawls all of a site's URLs and related files as well.