How can I get all the pages of my website indexed by Google, Yahoo, and MSN? Are there people actually submitting every page of a website? I do have some pages indexed. Is there a check that can be run to test this?
Put <META NAME="robots" CONTENT="index,follow"> on all pages, plus a site map and a good linking structure. Don't make an insane number of deep folders. Google has an XML sitemap submission; I didn't bother, as it seemed like a hassle. With a URL like abc.com/foldera/folderb/folderc/folderd/webpage.html, I don't know exactly how many levels deep you can go, but I try to keep everything one folder deep.
Use Google Analytics; it gives you a lot of useful features. (ROAR is right about the Google Sitemap Generator: it is not too user-friendly.) And of course, prepare a sitemap. One option for this is http://www.auditmypc.com/free-sitemap-generator.asp (not mine!). Search Google for "sitemap generator" for more options. Do the basics, like linking your pages to each other extensively. Best wishes, Sujith
The <meta name="robots" content="index,follow"> tag does nothing for the search engines; the robots meta tag is used to keep search engines off individual pages, like robots.txt. The best approach is clear, simple navigation with breadcrumbs, and don't have too many levels of navigation.
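To make the distinction concrete, here is what a robots.txt exclusion looks like (the /private/ path is just an example):

```
# robots.txt: keeps compliant bots out of a whole directory
User-agent: *
Disallow: /private/
```

The per-page equivalent is putting <meta name="robots" content="noindex,nofollow"> in that page's head. In other words, both are for keeping pages OUT; "index,follow" is already the default and tells the engines nothing new.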
Search engines should crawl at least two levels deep on your site, as long as you don't have more than 100 links per page. If you don't want to water down the value of your links and don't want to take any chances, I'd suggest keeping it to 50 links per page. With 50 links on the home page, each pointing to a page with 50 more links, that covers 50 x 50 = 2,500 pages.
Gypsy, maybe. I have lived remotely off it for a few years, though. Not saying it's the best way; I'm not that pretentious. It got me to #1 for Viagra sales. KIDDING. I can do whatever I want, whenever I want, wherever I want. That's good enough for me. But I agree, there is probably a much better way. Cheers.
Hey there, you only have to get your main page indexed, then use anchor text and the like to link to your deeper site pages. They will then automatically get indexed by the search engines. If you submit them all to the search engines at the beginning, it may seem like it did a good job, but it could be hurting your rankings. If you let it happen automatically, you're guaranteed no penalties.
Index and follow. A Google sitemap might get some pages listed faster. 50 links per page, no more. Two clicks from the front page, no more. That's what works for me.
I have a "Recent..." links box near the bottom of my often-crawled home page (last link in my signature). I place an anchor-text-rich link to every new inner page in that home-page location at least until Google indexes it, which is usually pretty quickly. If your home page lends itself to that kind of format, it's a quick way to get pages indexed.
As I said before, you don't need to submit the pages to the SEs, but using a site map as well as anchor text in your body content is a great way to get every page indexed and ranked for the different terms you want.
I have a huge site for which: I've never submitted any page to search engines; I'm not using META tags for "index", "follow" or similar (only a robots.txt file to exclude certain directories from being crawled); and I don't have any sitemap, because it's complicated to keep one updated on a huge site. However, my site is frequently crawled by Google, Yahoo, Ask, etc. (currently Yahoo has indexed 350,000+ pages for that site). All that's needed (at least for me) is a good structure with several levels, original content, and deep linking to my inner pages from other sites.
You can create a simple list of all your pages, like:

http://mysite.com/index.html
http://mysite.com/specials.html
http://mysite.com/order-page.html
http://mysite.com/privacy-policy.html

and save it as a txt file with any name you choose, maybe sitemap.txt. Upload that text file to your site, then submit its URL (http://mysite.com/sitemap.txt) at http://www.google.com/webmasters/sitemaps and it should get your pages into G's index pretty quickly.
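If listing the URLs by hand gets tedious, a few lines of script can write the file for you. A minimal sketch in Python; the page list is hypothetical, so swap in however you enumerate your own pages:

```python
# Write a plain-text sitemap, one URL per line, suitable for
# submitting to Google's sitemap tool. Page list is made up.
pages = [
    "http://mysite.com/index.html",
    "http://mysite.com/specials.html",
    "http://mysite.com/order-page.html",
    "http://mysite.com/privacy-policy.html",
]

with open("sitemap.txt", "w") as f:
    f.write("\n".join(pages) + "\n")
```

Then upload sitemap.txt to your document root and submit its URL as described above.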
Swooop- That doesn't really work. You can do that all day long, and google will never index past your home page. And I have the data from 50 websites submitted to prove it. At least for new domains. Older domains act a little differently. But, for a PR0, new domain with just a handful of incoming links, it's VERY hard to get anything indexed beyond the home page. Just my personal experience. MR
Even if that did work, I wouldn't submit anything to Google right now, at least not until some things are cleared up. I read somewhere that sites in Google's system will not be able to avoid the results they are going to pull from each person. It could just be a big bunch of nonsense, but I would rather be safe than sorry. So I just let the search engines find my stuff organically. It yields better results anyhow.
You could never submit a sitemap if blogger.com is hosting it. I'd be interested to know if anyone on Blogger (Blogspot) has had any success at all.
I submitted something similar to this in the directory section today. That thread was directed towards directories, not web sites in general, so you might have to tweak it a wee bit to get it working for your site. I saw this thread pop up in the "fun stuff >> spy" section (BTW: great add-on, Shawn) and figured I would point you guys in that direction; hope it helps someone.

The gist of the thread is that I put together a quick and dirty tutorial on how to get your directory fully indexed. It's all about your XML feed. If your site is dynamic, which most sites over 100 pages should be, then you can query the db and build an RSS/XML file with no more than about 10 minutes of work. For bots, RSS feeds are like an all-you-can-eat buffet: loads of yummy content and links to follow. This is very important for directories, because what good is it to list your site in a directory if Google can't even find the category page your link is on?
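A quick and dirty sketch of the query-the-db-and-build-a-feed idea, in Python. An in-memory SQLite table stands in for the real database here, and the table name, columns, and example.com URLs are all made up for illustration; point the query at your own pages table:

```python
import sqlite3
from xml.sax.saxutils import escape

# Stand-in database: in a real setup you'd connect to your site's db.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (title TEXT, url TEXT, summary TEXT)")
conn.executemany(
    "INSERT INTO pages VALUES (?, ?, ?)",
    [
        ("Widgets category", "http://example.com/widgets/", "All widget listings"),
        ("Gadgets category", "http://example.com/gadgets/", "All gadget listings"),
    ],
)

# Build one <item> per page, escaping text so the XML stays valid.
items = []
for title, url, summary in conn.execute("SELECT title, url, summary FROM pages"):
    items.append(
        "  <item>\n"
        f"    <title>{escape(title)}</title>\n"
        f"    <link>{escape(url)}</link>\n"
        f"    <description>{escape(summary)}</description>\n"
        "  </item>"
    )

# Wrap the items in a minimal RSS 2.0 channel and write the feed.
feed = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<rss version="2.0">\n'
    "<channel>\n"
    "  <title>Example Directory</title>\n"
    "  <link>http://example.com/</link>\n"
    "  <description>Recently added category pages</description>\n"
    + "\n".join(items)
    + "\n</channel>\n</rss>\n"
)

with open("feed.xml", "w") as f:
    f.write(feed)
```

Regenerate the feed whenever pages are added, link to it from your pages (or submit it), and the bots get a fresh menu of URLs to follow.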