Hi, I have a serious question about SEO and hosting accounts. I have a hosting account with HostMonster. As everyone knows, you can host many websites on a single hosting account, and that is what I am doing. In my public_html I have many different folders, and every website has its own folder. I installed my main website in public_html itself. That means when Google crawls my main website, it automatically indexes all the other folders too. My question is: do Google's robots think my websites are duplicates? I ask because my PR dropped from 5 to 0, and I know I didn't sell any links and I have done exactly the same SEO as before. I appreciate any suggestions.
Google will not automatically index the other sites/folders unless there is a link pointing to them. In any case, you will not lose PageRank because of duplicate content.
Actually, PageRank is not that important to me because I have good traffic; I was just wondering why my PR dropped so sharply. It is nice to have both good traffic and good PR, but unfortunately I lost my good PR.
Unless each of your folders contains the same (or nearly the same) information as your main folder, it's not duplicate content. And if you are running that many sites on the same subject with the same information, it will hurt you, and you should think about consolidating it all into one domain.
No, my websites are all different. But every website is in a folder, and those folders sit inside the main folder (public_html). With a hosting account like this, you have to install a main domain in public_html, and every other website gets its own folder under public_html. What concerns me is this, for example: the main domain in public_html is www.domain1.com, folder "domain 2" contains www.domain2.com, and folder "domain 3" contains www.domain3.com. I believe Google's robots are indexing www.domain2.com, and they are also indexing "folder domain 2" as part of www.domain1.com. That means the robots may think www.domain2.com is an exact copy of the content in "folder domain 2", because that folder is under the main public_html folder. I think Google has penalized my websites because it believes my main domain www.domain1.com has exactly the same information as www.domain2.com, www.domain3.com, and so on. I hope I explained it well with my limited knowledge. I appreciate any other suggestions.
They shouldn't be doing that, but if they are, just block those folders in your robots.txt file. Also make sure you have 301 redirects in place from the folder URLs to the add-on domain names. That should stop this dead in its tracks.
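To make that concrete, here is a minimal sketch of both fixes. It assumes the main domain is www.domain1.com and the add-on sites live in folders named domain2 and domain3 under public_html, as in the example above; adjust the folder names to whatever your account actually uses.

```apache
# robots.txt in public_html (served as www.domain1.com/robots.txt)
# Tell crawlers not to index the add-on folders under the main domain.
User-agent: *
Disallow: /domain2/
Disallow: /domain3/
```

And the 301 redirects, assuming an Apache host that allows .htaccess with mod_rewrite (typical for shared hosting like HostMonster):

```apache
# .htaccess in public_html
# When a folder URL is requested via the MAIN domain, 301-redirect it
# to the matching add-on domain so only one URL serves the content.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
RewriteRule ^domain2/(.*)$ http://www.domain2.com/$1 [R=301,L]

RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
RewriteRule ^domain3/(.*)$ http://www.domain3.com/$1 [R=301,L]
```

The RewriteCond on the host header matters: without it, requests arriving via www.domain2.com itself (which maps to the same folder) would also be redirected, creating a redirect loop.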