My site has a robots.txt file that blocks directory-submission agents from accessing it. Will this affect link building? What can be done?
What exactly do you mean? IMO your post doesn't make sense as written. What does blocking search spiders from a page have to do with link building?
Kind of. Crawlers will still crawl your website as they normally would, but they won't crawl the pages you have disallowed.
If you have disallowed pages to, for example, the directory agents you mention, then your link building cannot include directory submissions, which may limit your link-building options for the site. Keep in mind that robots.txt controls which parts of your site crawlers may access; it does not by itself get pages indexed.
It's useful for your website's indexing. When a bot visits your website, it first reads the robots.txt file (if one exists) and then crawls only the pages the file permits. For example, the following rule allows all robots to crawl all files: User-agent: * Disallow:
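To see how a crawler interprets rules like the one above, here is a minimal sketch using Python's standard-library `urllib.robotparser` (the `example.com` URLs and the `/private/` path are hypothetical, chosen just for illustration):

```python
import urllib.robotparser

# Permissive rules: "Disallow:" with an empty path blocks nothing,
# so every bot may crawl every page.
permissive = urllib.robotparser.RobotFileParser()
permissive.parse(["User-agent: *", "Disallow:"])

# Restrictive rules: block every crawler from the /private/ directory.
restrictive = urllib.robotparser.RobotFileParser()
restrictive.parse(["User-agent: *", "Disallow: /private/"])

print(permissive.can_fetch("*", "https://example.com/page"))         # True
print(restrictive.can_fetch("*", "https://example.com/private/x"))   # False
print(restrictive.can_fetch("*", "https://example.com/public"))      # True
```

Well-behaved crawlers (including directory-submission agents) perform this same check before fetching a page, which is why a Disallow rule can stop them from seeing your site at all.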
robots.txt is very useful for controlling which of your web pages crawlers can access. If you block a page via robots.txt, crawlers won't fetch or cache it, so for SEO purposes you should allow crawlers to reach any page you want indexed.