I submitted my sitemap some time ago, but when I use Google Webmaster Tools to check the health of my website, http://www.alvinweightlossguru.com/ I notice that half of the total indexed URLs are marked as blocked or "not selected"! I am not using any robots.txt, nor am I doing any blackhat SEO, so why is this happening? I have been trying to figure it out for several weeks, but to no avail. I am still able to rank on page one for some low-competition keywords, though.
You could consider adding noindex to your category/archive pages so your sitemap mainly contains unique content URLs. By the way, "Allow" was not part of the original robots.txt standard and is not supported by all crawlers (your file contains it), so it's safer not to rely on it.
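To noindex a page, you'd put a robots meta tag in the <head> of each category/archive page. A minimal sketch (if you're on WordPress, an SEO plugin can usually add this for you instead of editing templates by hand):

    <!-- tells crawlers not to index this page, but still follow its links -->
    <meta name="robots" content="noindex, follow">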
Sorry, how do I get rid of all those problems? I am not well versed in robots.txt; any help would be greatly appreciated!
You can use robots.txt to block certain unwanted pages from being crawled. Just Google for sample robots.txt files and you will find plenty of help. See the sketch below for a starting point.
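A minimal sketch of a robots.txt, assuming your duplicate URLs live under /category/ and /tag/ (substitute whatever paths apply to your site, and note the sitemap filename is just an example):

    # applies to all crawlers
    User-agent: *
    # keep duplicate archive listings out of the crawl
    Disallow: /category/
    Disallow: /tag/
    # point crawlers at your sitemap (adjust the filename to yours)
    Sitemap: http://www.alvinweightlossguru.com/sitemap.xml

One caveat: robots.txt only prevents crawling; pages that are already indexed are better removed with the noindex meta tag mentioned above, since a blocked page can't be recrawled to see the noindex.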