I developed a Blogger blog that was moving quickly up the rankings for a specific keyword. I was doing some research, followed the instructions on this site, and edited my custom robots.txt and custom robots header tags. This morning I woke up to a rankings drop of around a page and noticed that 63 of my URLs have been blocked from being crawled. I am new to this and unsure what is going on. Should I go back and disable custom robots.txt and custom robots header tags? I would appreciate any and all advice.
It might be your content. If Google sees something fishy (especially on its own platform), it might start deindexing your content. Other than that, your site might just be doing the Google dance, and rankings will fluctuate. Keep calm and optimize.
I think you blocked your pages yourself with custom robots.txt. If you are using Blogger, you don't need it. Custom robots.txt is a way to instruct search engines not to crawl certain pages of your blog, so one wrong rule can easily block 63 URLs at once. Disabling the custom file restores Blogger's default, which looks roughly like the example below.
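For reference, here is a sketch of what Blogger's default robots.txt typically looks like; the exact contents can vary, and yourblog.blogspot.com is just a placeholder for your own blog's address:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

# Placeholder address; Blogger substitutes your actual blog URL here
Sitemap: https://yourblog.blogspot.com/sitemap.xml

Note that the only path it blocks is /search (label and search-result pages, which are low-value duplicates); everything else is allowed. That is why most Blogger blogs never need a custom file, and why pasting extra Disallow rules from a guide can knock dozens of URLs out of the crawl overnight.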