At first I used:

User-agent: *
Disallow: /

but then (about 8 hours ago) I changed it to:

User-agent: *
Allow: /

so now there are no restrictions for bots in my robots.txt: http://www.forumese.com/robots.txt

But Webmaster Tools still says "Denied by robots.txt", and that's why my pages are not being indexed from the sitemap I submitted. Is it normal for the change to take this long to take effect?
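If you want to check what crawlers actually see right now, you can parse the live file yourself. Here is a minimal sketch using Python's built-in urllib.robotparser (the robots.txt URL is the one from the post; the page being tested is just the site root as an example):

import urllib.robotparser

# Fetch and parse the live robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.forumese.com/robots.txt")
rp.read()

# can_fetch() returns True if the given user agent may crawl the URL
print(rp.can_fetch("*", "http://www.forumese.com/"))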
You have a major error! Do not put "/" after Disallow: that tells crawlers to disallow every page on the site. And don't rely on Allow either; it isn't part of the original robots.txt standard, so not every crawler understands it. To allow everything, the file should read:

User-agent: *
Disallow:

If nothing follows the Disallow line, everything is allowed.
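To see the difference between the two forms in practice, here is a minimal sketch using Python's built-in urllib.robotparser; parse() takes the file contents as a list of lines, so no live site is needed (example.com is just a placeholder):

import urllib.robotparser

def allowed(robots_txt, url, agent="*"):
    # Parse an in-memory robots.txt and ask whether the agent may crawl the URL
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

block_all = "User-agent: *\nDisallow: /"   # "/" after Disallow blocks the whole site
allow_all = "User-agent: *\nDisallow:"     # empty Disallow means no restriction

print(allowed(block_all, "http://example.com/page.html"))  # False
print(allowed(allow_all, "http://example.com/page.html"))  # True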
digitech is right. Also, you changed your robots.txt about 8 hours ago, but Google last downloaded the file 15 hours ago, so it is still acting on the old version that blocked everything. It should pick up the change the next time it re-fetches robots.txt.