Googlebot recently couldn't access one of our sites due to crawling issues, so I wanted to share the experience, since experience can be the best way to learn. For a definition, here is Google's answer: access denied errors. Issues like this must be noted and fixed immediately, because they can seriously hurt your website if you take them for granted.

What I did:

• Read the notification and look at the site's overall error rate for DNS queries. Is it 100% or below? As you can see, the site had an overall error rate of 33%, so what's next?

• Read the recommended action for your error rate. According to Google's Webmaster Central post on crawl error alerts, the usual causes are the following:
  - The DNS server is down or misconfigured
  - A firewall on your web server is blocking requests
  - Your server is refusing connections from Googlebot
  - The server is down or overloaded
  - The site's robots.txt is inaccessible

• Since the rate was less than 100%, I considered the possibility that the web server was down or overloaded. One of the recommended actions is to contact the hosting provider to discuss the configuration issue and ask about allocating more resources for the DNS service.

• Double-check the real cause, or investigate further. In my case it was an overloading issue. If the error rate is a hundred percent, just follow the recommendations given by Google (see the image above), or simply do the following:
  - Check the DNS settings
  - Check that your hosting account is active and verified

• Take action immediately. Before contacting the hosting provider, I started removing the insignificant plug-ins to test whether that would solve the issue; if not, I would contact the hosting provider right away.

• Check whether the fix worked by using Fetch as Google to verify that Googlebot can now access your site.
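The decision in the steps above can be sketched as a tiny script. This is only my own summary of the branching logic described here, not an official Google tool, and the function name and messages are made up for illustration:

```shell
#!/bin/sh
# Sketch: map the DNS error rate reported in Webmaster Tools to the
# troubleshooting path described above. The thresholds and wording are
# my own summary of Google's advice, not an official check.

recommend_action() {
  rate="$1"   # overall DNS error rate, in percent
  if [ "$rate" -ge 100 ]; then
    echo "Check your DNS settings and verify the hosting account is active."
  elif [ "$rate" -gt 0 ]; then
    echo "Check whether the server is down or overloaded; ask your host about allocating more resources for DNS."
  else
    echo "No DNS errors reported; no action needed."
  fi
}

# The 33% rate from my case falls into the "server down or overloaded" branch.
recommend_action 33
```

Running it with `33` prints the partial-failure advice; with `100` it points you at DNS settings and the hosting account instead.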
Luckily, just removing those insignificant plug-ins and re-checking with Fetch as Google in Webmaster Tools was enough: the website can now be accessed by Googlebot. But if you find that those plug-ins are still useful to you, all you need to do is contact your hosting provider and talk about allocating more resources for DNS.
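Since an inaccessible or overly strict robots.txt is one of the causes listed above, it's worth a quick local sanity check alongside Fetch as Google. This is a rough sketch only; the sample robots.txt content here is made up, and a blanket `Disallow: /` is just one of many ways a file can block Googlebot:

```shell
#!/bin/sh
# Sketch: flag a blanket "Disallow: /" rule in a robots.txt file.
# The sample content below is illustrative; point this at your real file.

robots_file=$(mktemp)
cat > "$robots_file" <<'EOF'
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /private/
EOF

# Very rough test: an exact "Disallow: /" line blocks the whole site
# for whichever user-agent section it appears in.
if grep -iq '^Disallow: /$' "$robots_file"; then
  result="blocked"
else
  result="ok"
fi
echo "robots.txt check: $result"
rm -f "$robots_file"
```

Here the empty `Disallow:` under Googlebot and the scoped `/private/` rule are fine, so the check reports `ok`; a bare `Disallow: /` line would flag it as `blocked`.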