The easiest way is to go to Google, search for your site's URL, then click "Cached". It will show you the exact date and time your site was last crawled, along with a snapshot of the page as it appeared to the bot...
Just tried that and I got a message that says "We're sorry... but your computer or network may be sending automated queries. To protect our users, we can't process your request right now." Do you guys know what causes this, and is there a way around it?
It happens when you're on a network shared with lots of computers that are all sending queries to the Google search engine. To avoid it, try using a proxy.
Any ideas on how to improve the PR of internal pages? I've got PR3 on the home page, and some of the internal pages have PR2, but some have no PageRank at all. What gives?
Hello friend, there are two options. First, you can check through the Google Toolbar. Second, you can use the cache: operator — search Google for cache:yoursite.com and it will show you when that page was last cached.
Look at the cached copy of your site's index page — it shows the date and time Google's crawler last visited your site.