Hi all, I'm currently building a site for a friend. Whenever I build a site, I normally use the robots.txt file to block the search engines at the root level while the site is under construction... then I lift the block when it's finished (or nearly finished). My friend's domain has a pre-existing PR3 (I don't know why). So my question is: while I build the site, should I still use robots.txt to restrict access? Or will that harm the search engines' current opinion of the site? Thanks in advance!
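For anyone unfamiliar with the root-level block I'm talking about, here's a minimal sketch. The rules shown are an assumption of a typical "block everything during construction" file, not my actual file; Python's standard urllib.robotparser shows how a well-behaved crawler would read it:

```python
# Sketch: how a crawler interprets a root-level robots.txt block.
# The rules below are a hypothetical "under construction" file,
# not the actual file from the site in question.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# "Disallow: /" under "User-agent: *" tells every compliant crawler
# to skip the entire site, regardless of the specific URL requested.
print(parser.can_fetch("Googlebot", "https://example.com/"))       # False
print(parser.can_fetch("Googlebot", "https://example.com/about"))  # False
```

Lifting the block just means removing (or emptying) the Disallow line, after which can_fetch returns True for those same URLs.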
If you are building a site, it should not be available online at all. No one likes to visit "under construction" pages, pages that look unfinished, or pages that do nothing for the viewer; people who land on pages like that leave and don't come back. Once pages are available online, you should not put restrictions on pages you want indexed. Google typically indexes on a 3-6 month cycle, so restrictions on available pages can delay indexing by months.
Don't use robots.txt to restrict access. It will be harmful to have to wait for the crawlers to come back and pick up the new rules, if they come back at all.