Dear All, is there a method to get the crawler coming to the site more often? I am running an ecommerce site and products are added every day (the product updates aren't reflected on the homepage in the index). However, it seems that the crawler simply doesn't visit the site. Also, I updated the meta title for the homepage and subcategory pages about 10 days ago, but the indexed pages are still showing the old meta title and description. What should I do? Thanks a lot!
Submit a sitemap via the webmaster tools section of the main search engines. Then make sure you update the lastmod timestamp and reflect changes, additions, deletions, etc., in the sitemap. The sitemap is re-crawled often and would most likely help you get the other pages (the newly added products) indexed as well.
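To show what keeping the lastmod timestamp up to date looks like in practice, here is a minimal sketch that generates a sitemap.xml with per-URL `<lastmod>` entries using only the Python standard library. The URLs and dates are hypothetical placeholders, not anything from the original poster's site:

```python
# Minimal sketch: build a sitemap.xml with <lastmod> entries (stdlib only).
# URLs and dates below are hypothetical placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: list of (url, lastmod) tuples; lastmod in W3C date format."""
    urlset = ET.Element("urlset", xmlns=NS)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("https://example.com/", "2010-06-01"),
    ("https://example.com/products/new-widget", "2010-06-02"),
])
print(xml)
```

When a product is added or a page changes, regenerate the file with a fresh lastmod date so the engines can see at a glance which URLs are new or updated.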
Another thing to keep in mind is that if you start adding content to a particular page on a regular basis, the cache rate should improve. The more regular the updates, the better the improvement should be.
Submit your XML sitemap in Google Webmaster Tools, and submit your site there as well. You can also upload a robots.txt file that allows crawling, for example:

```
User-agent: *
Allow: /
```
Sitemap submission is the best method. Second best is publishing articles constantly; this ensures the bots see your URL all around the net and at least keep visiting your site. Of course, when they come they expect fresh content too, so make sure that is there on your site as well.
Hi Buddy!!! If the content on your site is well written around your targeted keywords, business, and services, the crawler will spend more time crawling your site. Valuable content helps.
Sleeping with the CEO of Google is actually a pretty effective method of getting your site crawled more often. I may have had to do some stuff I'm not proud of, but I sure don't regret it. He was very gentle, lmao.
I would suggest you have an RSS feed of the products added daily to the website. Submit that RSS feed to top RSS directories like FeedBurner, Bloglines.com, and others. This will greatly enhance your search engine visibility.
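As a sketch of what such a product feed might look like, here is a minimal RSS 2.0 generator using only the Python standard library. The store name, product titles, and URLs are hypothetical placeholders:

```python
# Minimal sketch: build an RSS 2.0 feed of newly added products (stdlib only).
# Store name, product titles, and URLs are hypothetical placeholders.
import xml.etree.ElementTree as ET

def build_product_feed(site_title, site_url, products):
    """products: list of (title, url) tuples for recently added items."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = "Newly added products"
    for title, url in products:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
    return ET.tostring(rss, encoding="unicode")

feed = build_product_feed("Example Store", "https://example.com/", [
    ("New Widget", "https://example.com/products/new-widget"),
])
print(feed)
```

Regenerating this feed whenever products are added gives feed readers and directories a fresh document to pick up each day.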
Submit your XML sitemap in Google Webmaster Tools. Also submit your site's links on sites that are crawled by the bots regularly.
Create an autoblog, or update your site content with videos, news links, forum links, and podcasts. Also tag any images with appropriate keywords (hint: ADD IMAGES!).
Dear All, thanks a lot for the wonderful suggestions. I have kept a record of them and am going to try them one by one. Thanks!
If you agree with those points and want to try them one by one, then you should share your site URL, because I want to help you.
What a crazy answer!!! It looks like it was rewritten using a tool. People are becoming so lazy that they are just using rewriters to post replies too. Thanks for sharing this link; added to my collection. LMAO, there would be a queue of SEOs lining up to do the same if that were the situation. Simple and well said.
Submitting sitemaps doesn't really help with crawl frequency at all. Google uses sitemap.xml files to assist them when THEY decide they are going to crawl your site. They use it primarily to figure out which pages are most important on your site, i.e. they use the <priority> value to determine "if we only have X places in our index for example.com pages, which X URLs would the webmaster prefer that we index first?" But it doesn't make them crawl more often.

The only times a sitemap.xml really helps are: 1) With a new site that has little or, better yet, NO inbound links from other sites' pages that are already indexed. In other words, if there is no way for Google to discover your site naturally by crawling inbound links from other sites, then a sitemap.xml can help you get indexed. But even then, it can sometimes take weeks to get crawled, and a sitemap.xml STILL doesn't guarantee any of your pages will be indexed. And 2) if you have a huge site with hundreds of thousands or millions of URLs, it can help the engines deep-crawl your site. Other than those two cases, sitemaps are really not that useful.

If you want to increase how often Google and other engines crawl your site, I would recommend two methods: 1) create new pages on your site constantly, and 2) build more backlinks.

Each site has a scheduled crawl rate where the engine visits your site every so often to update its index. Sites are often put on a longer crawl cycle, like once per month (blogs are an exception), when they are first indexed. If each time Google returns to your site they find new pages to crawl, they will typically raise your crawl frequency, maybe coming back in 2 weeks instead of a month the next time. If they come back in 2 weeks and find a bunch of new pages, they will typically up your crawl rate again, maybe coming back in a week the next time.
This continues until Google finds what they think is a good balance between your content generation rate and their crawl frequency.

Not only does each site have a scheduled crawl rate at which Google and the other engines visit and index it on a regular basis, but they ALSO visit your site in between those scheduled crawls. They do this by following links to your site from other websites while performing scheduled crawls on those other sites. So if Site A has a page that links to one of your URLs, then when Google crawls Site A's page, they follow all outbound links from that page to verify that the pages being linked to are still there, are returning 404 errors, are returning 301/302 redirect status codes, etc., so that they can update their link graph accordingly. When they request the page on your site that Site A links to, they will not only index that page, but will often crawl a handful of pages that your URL links to as well. So they do lots of small partial crawls of your site between your scheduled crawls by following inbound links from other sites. The more links you have, the more of these intermittent, partial crawls occur.
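For reference, the <priority> hint mentioned above is a per-URL value between 0.0 and 1.0 set inside the sitemap itself. A small sketch, with hypothetical URLs and priority values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/products/new-widget</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

The value is a relative hint about which of your own URLs matter most to you; it does not affect how your site ranks against other sites, and it does not change crawl frequency.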