Hi everyone, I'm putting together a website for a client and was going to upload the files to a subdirectory on their web host's server as I finish parts of it. If I do this, will search engines come and index the site even though it's still under construction and will eventually be moved to another directory? Any advice appreciated.
Yes, if Googlebot comes to your server it will most likely crawl and index your new site even though it's under construction. The best way to keep under-construction pages out of the index is to block that specific directory via robots.txt. Just add lines like this to robots.txt:

User-agent: *
Disallow: /directoryName
Disallow: /fileName

Googlebot (and any other well-behaved crawler) will then stay away from the pages under the specified directory.
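To make that concrete, here's a minimal sketch of a complete robots.txt, assuming your work-in-progress files live in a hypothetical /client-site/ subdirectory (swap in the real directory name):

User-agent: *
Disallow: /client-site/

Two things to watch: the file must be named robots.txt and sit at the root of the domain (e.g. http://example.com/robots.txt, not inside the subdirectory), and the Disallow value is matched as a path prefix, so /client-site/ covers everything under that directory.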
Yes, it will be indexed quickly, especially if your main site is already indexed. If you want to stop search engines, create a robots.txt file in your root directory with the same kind of rule as above:

User-agent: *
Disallow: /directory-name
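If you want to double-check that your rule actually blocks what you think it does, Python's standard urllib.robotparser can fetch the live robots.txt and tell you whether a given URL would be allowed. A quick sketch (example.com and the path are placeholders):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://example.com/robots.txt")  # placeholder: your real domain
rp.read()  # fetch and parse the live robots.txt
# Should print False once the Disallow rule is in place
print(rp.can_fetch("*", "http://example.com/client-site/index.html"))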
I'm not really sure what you are trying to do: are you trying to use a subdirectory so Googlebot doesn't find your site, or do you want to know whether the site can still get indexed while it sits in a subdirectory? If you want the site indexed, put it in the main folder. If you want to keep Googlebot away until it's done, an unlinked subdirectory makes the site less likely to be found, but it's no guarantee; the robots.txt rule above is the reliable way. Then you can finish building your site and announce it.