I submitted my sitemap to Google Webmaster Tools, and it says I have 13,000 pages. But when I search site:www.mysite.com on google.com, only 4,000 pages show as indexed. I submitted my sitemap three weeks ago!
Do you mean you already have a sitemap installed, or are you asking how to create one? If it's the first, you can use a WordPress plugin for it. Just search for one.
I have the sitemap successfully installed in Google Webmaster Tools with 13,000 links, but only 4,000 links show as indexed when I search site:www.mysite.com on google.com.
Your English is really tough to follow. Be patient with the indexing of your site; you probably have lots of pages that are similar to one another, or something like that. Google won't give the same importance to all of your pages. You can suggest the importance a page deserves, but that means nothing if the page itself isn't worth anything. That's my opinion.
Please review my Google XML Sitemap FAQ on DP... http://forums.digitalpoint.com/showthread.php?t=531765
Not all of the pages will necessarily be recognized by Google as crawlable, worthy pages. Template pages, pages blocked by .htaccess or robots.txt, and others may not be included, even though they are in your sitemap. Also, crawling can take a while. Take a look at how many pages are crawled per day in Webmaster Tools and you'll realize that it could take quite a while.
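As a sanity check on the numbers above, you can count what your sitemap actually lists before comparing it against what Google reports. A minimal sketch using Python's standard library XML parser; the sample sitemap and the helper name `count_sitemap_urls` are illustrative, not anything from Webmaster Tools:

```python
# Hypothetical helper: count the <loc> entries a sitemap actually lists,
# so you can compare that number against what Google reports as indexed.
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace (sitemaps.org).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(sitemap_xml: str) -> int:
    """Return the number of <url>/<loc> entries in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(f"{SITEMAP_NS}url/{SITEMAP_NS}loc"))

# Tiny illustrative sitemap with two URLs.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/page-1</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 2
```

If the count disagrees with what Webmaster Tools reports for the sitemap, the file itself may be malformed or contain blocked URLs.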
Yes, just be patient. Sometimes Google's crawler will only visit a site once or twice a month, depending on how frequently you update it. Tips: 1. Make sure you have links to the pages that aren't being indexed (preferably on the front page); if the spider can't reach them, it certainly won't be able to index them. 2. Add some fresh content to your site, so that the next time the spider comes back it will have something new to read.
On that note: Google's crawl rate also adjusts to your server's responsiveness. If they detect that your server is lagging, they back off and crawl at a slower rate. On a large site, that can mean it takes a long time to crawl everything, and infrequent crawling results. So having a server and bandwidth that can stay responsive under heavy traffic loads will improve crawl frequency for large sites.
4,000 pages out of 13,000 in three weeks is not that bad. You should monitor in Google Webmaster Tools how the number of indexed pages increases each week.
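To put the "not that bad" in numbers: a rough back-of-the-envelope sketch of how long full indexing might take if the current pace holds. The figures come from this thread; the constant-rate assumption is mine and real crawl rates vary week to week:

```python
# Back-of-the-envelope estimate, assuming the crawl pace stays constant.
# Figures from the thread: 4,000 of 13,000 pages indexed in 3 weeks.
import math

total_pages = 13000
indexed_so_far = 4000
weeks_elapsed = 3

pages_per_week = indexed_so_far / weeks_elapsed     # ~1333 pages/week
remaining = total_pages - indexed_so_far            # 9000 pages left
weeks_left = math.ceil(remaining / pages_per_week)  # rounded up

print(f"~{pages_per_week:.0f} pages/week, about {weeks_left} more weeks")
```

So at this pace the whole sitemap would be indexed in roughly two more months, which is why patience is the usual advice.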
A sitemap is only a list of your web pages; indexing is related to the strength of your backlinks. I suggest you get more backlinks to increase the indexing speed.
I know that, and when you submit to Digg you may get indexed within two minutes. But my URL is banned from Digg.
I don't mean spamming social bookmarking sites or other sites. Try to get real backlinks; visiting the DP link development forums can be a good start.