Hello, I think there might be something wrong with my robots.txt. It looks like this:

User-agent: *
Disallow:
Sitemap: http://www.domainname.com/sitemap.xml.gz

Do you guys think this is okay? When I try to view the sitemap in a browser, the file pops up as a download instead.
This looks fine. The sitemap is a compressed file (shown by the .gz extension), so the browser offers it as a download that you must uncompress with gzip on Linux (or WinRAR on Windows). Google handles this automatically, so your sitemap and robots.txt are okay. If you want to read the sitemap on the web, you will need to serve an uncompressed copy without the .gz extension. Simply removing .gz from the filename won't work; you have to actually uncompress the file first.
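To see why the browser offers a download, here is a minimal local sketch of the round trip: gzip a sitemap, then uncompress it to read the plain XML (assumes the standard gzip/gunzip tools are installed; the file names are just examples):

```shell
# Create a tiny example sitemap, compress it, then read it back.
printf '<?xml version="1.0"?><urlset></urlset>' > sitemap.xml
gzip -f sitemap.xml          # produces sitemap.xml.gz, removes sitemap.xml
gunzip -c sitemap.xml.gz     # -c prints the uncompressed XML to stdout
```

For a live site, the same idea applies: download the .gz file (for example with curl) and run it through gunzip before trying to read it.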
Pretty much, you have it set up to allow everything (assuming the text you posted here is exactly what is in the robots file). Whether that is right depends on what you are trying to accomplish. Many people also make sure the site redirects to either "domain.com" or "www.domain.com" no matter which one a visitor enters (though that is a server setting, not robots.txt). Another common use is blocking bots that misbehave or constantly crawl the site, and another is telling crawlers NOT to index certain directories. Research robots files and you will see a million combinations; also search for "seo robots txt file" to see what others suggest as best practice for optimization and indexing purposes.
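As a sketch of the variations mentioned above, here is what a slightly more restrictive robots.txt could look like (the directory names and the bot name "BadBot" are made-up examples, not recommendations for your site):

```
# Allow all crawlers, but keep some directories out of the index
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Block one misbehaving crawler entirely (hypothetical name)
User-agent: BadBot
Disallow: /

Sitemap: http://www.domainname.com/sitemap.xml.gz
```

Note that the empty "Disallow:" in your current file means "disallow nothing", i.e. everything is allowed, which is a perfectly valid setup.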
I think it's better to rewrite your robots.txt from scratch; it's easy to do, and it's not worth risking thousands of dollars in lost Google traffic over a misconfigured file. Better to make sure it's fixed.