I've noticed that a lot of sites I've recently evaluated don't have their XML and HTML sitemaps referenced in the robots.txt file. Including sitemaps in robots.txt is generally considered good for SEO and helps Googlebot with crawlability. How important is it to you that your robots.txt file includes sitemap references?
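For anyone who hasn't seen it, the reference is a single Sitemap directive; a minimal robots.txt (example.com is a placeholder domain) looks like this:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap line is independent of any User-agent group, can appear anywhere in the file, and can be repeated if you have multiple sitemaps.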
I always thought that Google will crawl your site (and sitemap) whether or not you have a Google webmaster account, and whether or not you include sitemaps in your robots.txt file. As long as you don't block Googlebot, it will crawl your site one way or another. Maybe including sitemaps in your robots.txt expedites the process, but I don't see how it's necessarily beneficial for SEO in the long run.
The only time sitemaps are really needed is for very complex sites where indexing bots might not find important pages in a timely manner. I never even considered a sitemap for my site until I had over 9,000 pages, and I NEVER had a problem being indexed. If maintaining sitemaps weren't so easy, I probably wouldn't have a sitemap on my site today, even though I currently have over 13,000 pages. Comparing before and after, I see NO DIFFERENCE in how my site is indexed or where it lands in the SERPs.
Sitemaps are pretty easy to create with most CMSs, so I'd make one and register it, but I suspect that for an actively maintained, well-structured site it's overkill. Adding it to robots.txt seems like trying to double dip: the file is really there to tell robots what NOT to do, rather than to point them toward content.
It is beneficial to include your sitemap in the robots.txt file. It doesn't take much effort: just paste the XML sitemap URL into your robots.txt file. The benefit is that when bots go through your robots.txt file, they also discover your sitemap, which helps your site get crawled.
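That discovery step is machine-readable, too. As a quick illustration (the file contents here are made up, and `site_maps()` requires Python 3.8+), Python's standard urllib.robotparser can pull the Sitemap lines out of a robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Made-up robots.txt contents; example.com is a placeholder domain.
robots_txt = """\
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Sitemap URLs declared in the file (Python 3.8+)
print(parser.site_maps())

# Crawl rules from the same file still work as usual
print(parser.can_fetch("*", "https://www.example.com/private/page"))
```

This is roughly what a crawler does: one fetch of robots.txt yields both the crawl rules and the sitemap location.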
Plenty of people out there do seem to think it's worth the trouble of learning good syntax, and because I'm weird, I enjoy getting it right. Make friends with the bots and they will reward you. I have a robots.txt syntax question: if I write Disallow: /*.pdf$ will it apply throughout the site, including subdirectories, or should I repeat the rule for every directory level? Thanks.
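For what it's worth, under Google's documented wildcard extension, * matches any sequence of characters and $ anchors the end of the URL, so a rule like Disallow: /*.pdf$ applies at every directory depth without being repeated. Note that Python's urllib.robotparser does not implement these wildcards; the sketch below just mimics the documented matching semantics:

```python
import re

def matches_rule(rule: str, path: str) -> bool:
    """Rough sketch of Google-style robots.txt pattern matching."""
    # '*' matches any run of characters; a trailing '$' anchors the end.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

print(matches_rule("/*.pdf$", "/file.pdf"))            # True
print(matches_rule("/*.pdf$", "/docs/deep/file.pdf"))  # True
print(matches_rule("/*.pdf$", "/file.pdf?x=1"))        # False
```

The third case is why the $ anchor matters: without it, the rule would also block .pdf URLs that carry query strings.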
It is not necessary to reference the sitemap in robots.txt, but it's better than not having it. It is important that the sitemap itself is optimized. You can use this plugin to create an optimal WordPress sitemap: wordpress.org/plugins/google-sitemap-generator
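Whatever tool generates it, the output is just the sitemaps.org protocol; a bare-bones sitemap with placeholder URL and date looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

Only the loc element is required per URL; lastmod and the other per-URL tags are optional hints.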