Hi, I have more than 1 lakh (100,000+) URLs, so I created multiple sitemap XML files. The first sitemap.xml is working well, but the second one is not working properly. Can I list my sitemap XMLs in robots.txt? Please help me. Thanks and regards, Rakseo
XML sitemaps and robots.txt serve different purposes. And yes, you can have multiple sitemaps. Manage them in Google Webmaster Tools.
Hello. First of all, if you want to make a big sitemap.xml you can create it yourself, and second, the robots.txt file is not used anywhere inside a sitemap.xml. But be careful when creating it.
XML sitemaps DO have something to do with robots.txt. Unless you want to manually administer them with every search engine on the planet, it's good practice to include a Sitemap: directive in your robots.txt pointing to your sitemap.xml (or to a sitemap index file if you have multiple sitemap.xml files) on your server. This way ALL search engines know where to find your map without you having to submit it manually through interfaces like Google WMT at each and every search engine. And to answer the OP's question: 1) create a sitemap index file, and 2) use the Sitemap: directive in your robots.txt to point to your sitemap index file; the engines will follow the sitemap index file to discover your remaining sitemap.xmls.
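As a sketch of those two steps (the domain and file names here are just examples, not from the OP's site), the robots.txt would carry a single Sitemap: line pointing at the index:

```
# robots.txt — the Sitemap: line can appear anywhere in the file
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap_index.xml
```

And the sitemap index file it points to lists the individual sitemap.xml files:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```

Note the Sitemap: directive takes a full absolute URL, and the index file uses `<sitemapindex>`/`<sitemap>` tags rather than the `<urlset>`/`<url>` tags of a regular sitemap.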
The only reason you should need multiple sitemap.xml files is if you have more than 50,000 URLs to submit. If so, create a sitemap index file that lists all of the individual sitemap files. No need for multiple webmaster accounts. If you don't want to deal with an index sitemap, you can instead include multiple Sitemap: directives in your robots.txt, each pointing to an individual sitemap.xml file.
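The splitting above can be sketched in Python; everything here (the domain, the file-naming scheme, the helper names) is illustrative and not something from this thread:

```python
# Sketch: split a large URL list into sitemap files of at most 50,000 URLs
# each (the sitemap protocol's per-file limit), plus an index file listing
# them. Domain and file names are illustrative only.
from xml.sax.saxutils import escape

SM_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50000  # per-file URL limit in the sitemap protocol

def build_sitemap(urls):
    """Return one <urlset> sitemap document for up to MAX_URLS URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (f'<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SM_NS}">\n{entries}\n</urlset>\n')

def build_index(sitemap_urls):
    """Return a <sitemapindex> document listing the individual sitemaps."""
    entries = "\n".join(f"  <sitemap><loc>{escape(u)}</loc></sitemap>"
                        for u in sitemap_urls)
    return (f'<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<sitemapindex xmlns="{SM_NS}">\n{entries}\n</sitemapindex>\n')

def split_into_sitemaps(urls, base="https://www.example.com/sitemap"):
    """Chunk `urls` into sitemap documents and one index referencing them."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    sitemaps = {f"{base}{n}.xml": build_sitemap(chunk)
                for n, chunk in enumerate(chunks, start=1)}
    index = build_index(list(sitemaps))
    return sitemaps, index
```

You would then write each generated document to disk at the URL the index promises, and point the Sitemap: line in robots.txt at the index file.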