I have about 10 websites on a shared server and I want to know how I would set up the robots.txt file for all of them. Do I have just one in the main root, or do I need one in each folder for each website? What content should I use in them? Any and all help is welcome. I am new to this, so I need help doing it the right way. Thanks
If the sites are subdirectories of one domain, you can just use one robots.txt file in the root directory. If they are different domains, you need one for each domain. Why exactly do you think you need a robots.txt file? You don't need it unless you are trying to restrict certain robots or keep certain directories from being spidered.
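If you do end up needing one, a minimal robots.txt that blocks specific directories looks like this (the folder names here are just placeholders, swap in your own):

```
# Applies to all crawlers
User-agent: *
# Block these directories from being spidered
Disallow: /subfolder1/
Disallow: /subfolder2/
```

One rule set per domain: the file goes at the domain root (e.g. http://example.com/robots.txt), and crawlers ignore it anywhere else.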
I have never used a robots.txt file. They may be useful for some things but I've never found a need for them.
OK, I have the main site and subdomains under the main one, so I do not want the spiders to spider the subfolders. I do not show up in Google or AOL yet, and I'm not sure why. I submitted my domain to the search engines over 10 months ago and nothing. Any good tricks or tips for getting listed and ranking high for certain keyword searches? I have read a lot of articles, but I am still not getting any results.
Google indexes pages when it finds links to them; if you don't link to the subfolders, it won't spider them. Why not just redirect the subfolders to the subdomains? What keywords are you trying to rank for? What's the URL?
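To do that redirect, assuming your shared host runs Apache and allows .htaccess files, something like this in the subfolder's .htaccess would work (the folder and subdomain names are just examples):

```
# Permanently redirect requests for /subfolder1/ to the matching subdomain
Redirect 301 /subfolder1/ http://subfolder1.example.com/
```

A 301 also tells search engines the subdomain is the canonical address, so you don't split ranking between two copies of the same content.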
OK, so why is it Google has not found my website yet? I am trying to get a high ranking for the keywords "Miami fine hotel". I do have subdomains pointing to subfolders now.