Hi all, I thought I saw an article somewhere saying that if you want Google to crawl the area of your site that is only open to registered members, you can create a user ID and password and submit them to Google so that it can use those credentials to crawl the registered area. But now I can't find any information about this. Is this true, and where can I get more information about it? Thank you guys!
Good question. All I know is that if you Allow the bots in the robots.txt file, then Googlebot should be able to index the content. But I haven't heard of submitting a user ID and password so Google can crawl like a registered user. Maybe someone else can clarify the point.
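For what it's worth, an Allow rule in robots.txt looks like the sketch below (the /members/ path is just a placeholder, not from the original post). Keep in mind robots.txt only tells crawlers which URLs they *may* fetch; it can't get a bot past a login form, so allowing a password-protected directory won't make it indexable on its own.

```
# Hypothetical robots.txt: let Googlebot into /members/ but keep other bots out.
# This only grants crawl permission; it does not bypass any login wall.
User-agent: Googlebot
Allow: /members/

User-agent: *
Disallow: /members/
```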
That depends on the site. Some sites allow crawling of their registered areas and some don't. They restrict it through "nofollow" attributes or robots.txt.
Never heard of this, but you can do one thing to get a registered area indexed by search engine bots: write JavaScript so that search engine bots can see the content but users can't see it without logging in. That way you can get the registered area indexed.
I remember seeing somewhere that you can provide Google with a username and password for a restricted area, but I'm not sure. It might have been for AdSense. Check Google Webmaster Tools; they should have the answer you need.
You need to have an account in Google Webmaster Tools. You can provide the account details through it, so the Google spider can crawl the pages in the registered area. I hope that helps.
That is exactly correct. I have done it with a few of my sites and it works perfectly. Also, Google Webmaster Tools explains exactly what you need to do to make this work.
This is the first time I've heard of this; however, I don't believe it. Can you let me know the URL where you found this information?
You know what, on second thought I think it was actually Google Analytics that allowed you to add password-protected pages so you could see the traffic info for those pages. I will look further into it and report back.
Google (and other search engines as well) cannot crawl a password-protected area of a website unless we provide them a separate pathway into the protected area based on the crawler's IP. That means serving the crawler a page different from the one actual visitors see. This whole process is called cloaking, and Google treats it as a black-hat technique.
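Just to make the mechanism above concrete: this is roughly what IP-based cloaking looks like in code. It's a minimal sketch with made-up details (the IP prefix is a documentation range, not a real crawler range, and the page strings are placeholders); it's shown only to illustrate the technique Google penalizes, not as something to deploy.

```python
# Sketch of IP-based cloaking: serve one page to "crawler" IPs,
# another to everyone else. Google treats this as black hat.
# The prefix below is TEST-NET-1 (a hypothetical stand-in), NOT a real bot range.
CRAWLER_IP_PREFIXES = ("192.0.2.",)

def page_for(visitor_ip: str) -> str:
    """Return the members-only page for 'crawler' IPs, a login wall otherwise."""
    if visitor_ip.startswith(CRAWLER_IP_PREFIXES):
        return "full members-only content"  # what the crawler would index
    return "login required"                  # what a normal visitor sees

print(page_for("192.0.2.15"))   # "crawler"-range IP gets the protected content
print(page_for("203.0.113.9"))  # regular visitor gets the login wall
```

Because the indexed page and the visible page differ, this is exactly the kind of mismatch Google's quality guidelines flag.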