Hi, I have both the www and non-www versions of my site in Google Webmaster Tools, and I'm using a 301 redirect to send the non-www version to the www version. About a day ago I updated my robots.txt file, and I've since noticed errors with the Disallow directives. I then noticed that if I go into the non-www version of the site in Webmaster Tools, it is still showing the OLD robots.txt file. I think this is what's causing the issues. It sounds confusing, I know, and the situation is hard to explain. Any help or advice much appreciated!
Let me guess... you're using a content management system such as WordPress that automatically generates a robots.txt file, right? (If not, your server sucks.)
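Either way, it's worth checking what each hostname actually serves, since crawlers request robots.txt separately per host (the non-www host gets its own fetch of /robots.txt, which your 301 should redirect too). Here's a minimal Python sketch that fetches robots.txt from both hosts without following redirects, so you see exactly what each one returns; example.com is just a placeholder for your own domain:

```python
import urllib.error
import urllib.request

# example.com is a placeholder -- substitute your own domain.
HOSTS = ["http://example.com", "http://www.example.com"]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so we see exactly what each host serves."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

for host in HOSTS:
    url = host + "/robots.txt"
    try:
        with opener.open(url, timeout=10) as resp:
            print(f"{url} -> HTTP {resp.status}")
            print(resp.read().decode("utf-8", errors="replace"))
    except urllib.error.HTTPError as e:
        # A 301/302 ends up here because we refused to follow it.
        print(f"{url} -> HTTP {e.code}, Location: {e.headers.get('Location')}")
```

If the non-www request comes back as a 301 pointing at the www copy, the server side is fine, and the stale file in Webmaster Tools is most likely just caching: Google generally re-fetches robots.txt about once a day, so a day-old change may simply not have been picked up for that host yet.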
Not at all - the robots.txt file was created manually and is perfectly clean and syntactically valid. I don't think the issue is server-related. What did you have in mind that could be causing problems like this?