Hi, I have a website, and Google Webmaster Tools shows me crawl errors for pages like http://www.domain.com/http://www.domain.com. That directory shows my entire site in Google Webmaster Tools. I have checked all my backlinks and the root folder via FTP and cPanel, but I cannot find the source of these errors. Finally I have decided to block them with robots.txt. I want to know the effect of Disallow: /http://www.domain.com/ Will it block my entire site from being indexed? All suggestions are welcome. Thanks
The Disallow directive in robots.txt works like this: Disallow: / means block the entire domain. If Google shows the error above, then something is wrong in the coding part. It may be the case that Google can reach a broken link from your home page, and that link is not found. For example, on your home page at domain dot com there may be some link that points to domain dot com / againyourdomain dot com (I can't post links, so I wrote it like this). When Google crawls that link, the page generally returns an error or not found. Check your site's code carefully; don't go for the robots.txt file. Hope this helps.
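To see what "Disallow: / means block the entire domain" looks like in practice, here is a minimal sketch using Python's standard-library robots.txt parser. The domain www.domain.com is just a placeholder, and urllib.robotparser's matching is an approximation of (not a guarantee of) how Googlebot itself interprets the file.

```python
# Sketch: "Disallow: /" blocks every URL on the domain for every crawler.
# www.domain.com is a placeholder domain, not a real site.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /
""".splitlines())

# Every path on the domain is disallowed.
print(rp.can_fetch("*", "http://www.domain.com/"))          # homepage: blocked
print(rp.can_fetch("*", "http://www.domain.com/any/page"))  # inner page: blocked
```

Both calls return False, confirming that a bare Disallow: / shuts out the whole site.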
I have checked my site and its code two or three times, but I did not find anything like that. I want to know the effect of Disallow: /http://www.domainname.com/ Thanks
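One way to test the effect of that exact rule before deploying it is with Python's standard-library robots.txt parser. This is a sketch using the placeholder domain www.domain.com, and it approximates robots.txt matching rather than being authoritative for Googlebot; still, the matching is literal prefix matching on the URL path, so the rule only catches the malformed doubled URLs:

```python
# Sketch: check what "Disallow: /http://www.domain.com/" actually matches.
# www.domain.com stands in for the real domain.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /http://www.domain.com/
""".splitlines())

# The homepage and normal pages are still allowed to be crawled...
print(rp.can_fetch("*", "http://www.domain.com/"))      # allowed
print(rp.can_fetch("*", "http://www.domain.com/page"))  # allowed
# ...only URLs whose *path* starts with /http://www.domain.com/ are blocked.
print(rp.can_fetch("*", "http://www.domain.com/http://www.domain.com/"))  # blocked
```

The first two calls return True and the last returns False: the rule blocks only the broken doubled URLs, not the site itself, because Disallow matches against the path portion of the URL, which for the homepage is just "/".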
You can check this with the robots.txt Tester tool in your Google Webmaster account, and it will show you whether this rule will block your entire site or just that page.
I have checked it. It shows that URL as a directory. But when I check my entire website, like http://www.domain.com, it also shows as a directory, so I am still confused. My site has good traffic and sales, and I do not want Googlebot to deindex it.