Hi, my site shares user-uploaded content and currently has tens of thousands of pages. When a user uploads some content, I'd like to let Google index the new page as fast as it can. Is it a good idea to dynamically generate a sitemap file and ping Google? Or should I have an RSS feed for Google? If the latter, how can I let Google know about my RSS feed? Thanks, Erik
XML sitemap... I would suggest using Google Webmaster Central (google.com/webmaster) to submit your sitemap, and also place a link to your XML sitemap on your homepage. Also don't forget to create a robots.txt file that links to your sitemap. You could do the RSS feed too, the more the merrier!
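For what it's worth, the robots.txt part is just one extra line, a minimal sketch assuming your sitemap sits at the site root (example.com is a placeholder):

```
User-agent: *
Allow: /

Sitemap: http://www.example.com/sitemap.xml
```

The `Sitemap:` line is how crawlers other than Google discover the file without you submitting it anywhere.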
Well, both work... but continuous link building can really help. Just don't stop promoting your site/pages and they will get crawled regularly.
The sitemap will help Google find all of your pages, and the RSS feed can also help generate traffic. Be sure to submit the RSS feed to sites like Syndic8.
Both sitemaps and RSS feeds will help your site get indexed quickly, though the most important factor is to get backlinks pointing to your site in the first place. You can declare your website's RSS feed in the page head using a tag like: <link rel='alternate' type='application/rss+xml' title='YOUR RSS TITLE' href='http://www.YOUR-DOMAIN.com/RSS_PATH/' />
Thanks for the suggestions. Sounds like everyone's saying a sitemap is a must-have. For a site with 10k-100k pages, will a sitemap file be too large? Panama and IEmailer, thanks for the tips on how to let search engines know about my RSS. I'll just use my RSS feed for "latest posts" to let the engines know. Very helpful tips! Erik
ic... that'd be good. Can I create just one sitemap file covering only the latest 1000 pages? That would probably contain only 1% of my site's pages, but since Google says sitemap files are supplementary to the crawler, that should be OK, right? I mean, putting only 1% of the site's URLs in the sitemap will NOT stop Googlebot from crawling the other 99% of pages, right? One more question: how often should I notify Google about pages added/modified?
I would just include all the pages in your sitemap. If you want to make things easier, you can create multiple sitemaps for different sections. I have a strong feeling that multiple sitemaps might be directly related to sitelinks, since a very large site would most likely use multiple sitemaps.
Submit it to Google Webmaster Tools and wait. They'll check back quite frequently on their own. If it has changed every time, Google automatically adjusts how frequently it checks. Oh yeah, and once again: put it in your robots.txt file so MSN/Yahoo/Ask can find it as well.
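To tie this back to the original "dynamically generate and ping" question, here's a minimal Python sketch. It's an assumption-laden example, not anyone's production code: example.com is a placeholder, and the ping target is Google's well-known `google.com/ping?sitemap=` endpoint.

```python
import urllib.parse
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(page_urls):
    """Build a sitemap XML string listing the given page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in page_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

def google_ping_url(sitemap_url):
    """The URL to fetch after the sitemap changes, to notify Google.

    The sitemap URL itself must be percent-encoded when passed
    as a query parameter.
    """
    return ("http://www.google.com/ping?sitemap="
            + urllib.parse.quote(sitemap_url, safe=""))

# Regenerate the sitemap from the newest pages, then ping:
xml = build_sitemap(["http://www.example.com/posts/1",
                     "http://www.example.com/posts/2"])
ping = google_ping_url("http://www.example.com/sitemap.xml")
```

You'd write `xml` to the sitemap file your robots.txt points at, then issue a GET to `ping` whenever new content goes live (or on a short batch interval, so you're not pinging once per upload).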
I ran into a similar situation while I was creating a sitemap. Later I abandoned the idea. I'd like to know how other people deal with this.
No can do. Google sitemaps have a limit of 50,000 links per file, so a big site won't fit in one. I created multiple sitemaps that are all called from one main sitemap, all in Google's format. This worked well, and my new posts show up in Google News within 30 minutes of posting.
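For reference, the "one main sitemap calling the others" is a sitemap index file in the standard sitemaps.org format; the domain and file names below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-posts-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-posts-2.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit only the index file; the crawler follows each `<loc>` to the child sitemaps.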
Now we've got two opinions. Which one is true? Another question: for sites where content is generated by users' posts, like forums, how would you deal with the sitemap?
Hmmm, never tried this before, but I have an idea. Most forums have RSS feeds for each thread, hopefully even for categories. Why not point the sitemap to the RSS feeds?
That's not a bad idea actually, centralizing it all in one place. Another issue that arises: since users will submit content by the minute on large sites like DP, how would we deal with the RSS feed, making sure it represents up-to-the-minute content? Any ideas?