This is the situation: about every 15 days one of my websites changes from 1k impressions a day to 150 impressions. 15 days later it gets back to 1k, and so on... Today it just got back to 1k, and I was wondering if I could use /robots.txt to disable the robots and spiders and hopefully never go back to 150 imp/day. Is this going to work, or will I be removed from Google search if I do that? Thanks for helping.
because next time they visit my website it will most probably get de-indexed and I will lose a lot of page impressions...
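For reference, what I had in mind was something like this (just a rough sketch, I have not put it live):

User-agent: *
Disallow: /

From what I understand, that tells every compliant bot, including Googlebot, to stay away from the whole site.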
This is nonsense and doesn't hold any logic, or at least I don't see the logic behind it. Leave your robots.txt alone and work on your content; Google's fluctuations are normal.
A good example of a robots.txt is:

User-agent: *
Allow: /
Sitemap: http://www.YourDomain.com/sitemap.xml.gz

Hope it helps.
You definitely do not want to get rid of the robots.txt; this is how the search engines find your pages. No matter what else you do, deleting it will just hurt you.