I have a new website I set up, and only about half the pages are currently indexed. My site has been crawled multiple times since the pages were added, and all of them are linked from the main page. How do I get these pages indexed?
Have you built an internal linking structure for your site? I.e. one page linking to another and so on in a chain? That's one of the best ways to get all of your pages indexed by search engines. Just be patient and you will see the results.
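In plain HTML it's nothing fancier than ordinary anchor tags between your pages, something like this (the page names here are just made-up examples):

    <!-- on page1.html -->
    <a href="page2.html">Next: Page 2</a>

    <!-- on page2.html -->
    <a href="page1.html">Back: Page 1</a>
    <a href="page3.html">Next: Page 3</a>

That way every page is reachable from every other page by following links, which gives the crawler a path to everything.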
Step 1. Create an XML sitemap.
Step 2. Submit your sitemap to Google: http://www.google.com/webmasters/
Step 3. Get backlinks to those pages if you can.
Step 4. Your site's pages should now appear in Google.
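For step 1, a bare-bones sitemap.xml looks something like this (example.com, the page names, and the date are just placeholders; swap in your own URLs):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2009-06-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>http://www.example.com/a.html</loc>
      </url>
    </urlset>

One <url> entry per page you want indexed; <lastmod> and <changefreq> are optional.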
I did the sitemap long ago. I'm knowledgeable with this stuff, and I've never had this problem before. According to Webmaster Tools my site has been crawled 3-4 times since I added and linked the pages. I thought maybe there was something else out there. I'm just a little antsy because I know once those pages get indexed my search traffic is going to skyrocket. I'm already getting 50-100 uniques from Google a day and not even a third of the pages are indexed.
Not to get chewed out by people, but just submit them manually, 3 a day per domain until they're all submitted. I have an A-through-Z project, which consists of the main page and an a.html, b.html, c.html, etc. Currently Google only has the index, a, and b. So what I'll do is individually submit each page (3 a day per domain, so I don't get dinged for spamming). I'll start with c, then d, e, and so on. You get the idea.
Does your robots.txt file reference the XML sitemap? Not that it would matter in your situation, but it can't hurt. How long have these pages been online? Do these pages have dynamic content? Are they possibly being dropped for duplicate content? I'm assuming the coding on these pages is good and they contain no broken links.
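If it doesn't, it's a one-line addition. A typical robots.txt that points to the sitemap looks like this (example.com standing in for your domain):

    User-agent: *
    Disallow:
    Sitemap: http://www.example.com/sitemap.xml

The Sitemap: line takes a full URL and can go anywhere in the file.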
Maybe Google doesn't want to index the other pages. Maybe they ran out of hard drive space. Who really knows.
If your web site is new, there's a long delay between when Googlebot scrapes your site and when the pages actually show up in the index. As others have said, create an XML sitemap (or have one created for you at www.xml-sitemaps.com), but I recommend creating an HTML sitemap too. Make a small link to the HTML sitemap on every page, and make sure the HTML sitemap has a link to every other (inner) page on your site, structured so that it's human-friendly as well as search-engine-friendly.

Add the XML sitemap to your robots.txt, submit it to Google and Yahoo, build backlinks, and keep checking every week. Don't get into the habit of checking daily; it will be a huge disappointment and less exciting than watching grass grow.

Instead, keep adding content and updating your sitemaps, and resubmit your sitemap after adding new pages, because this may help Google and Yahoo learn that your web site is updated frequently. I'm not 100% sure about that, but I don't think it could be proven either way and I don't think it will hurt you.
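The HTML sitemap doesn't need to be anything elaborate; a plain list of links on its own page works fine, something along these lines (sitemap.html and the page names are just placeholders):

    <!-- sitemap.html -->
    <h1>Site Map</h1>
    <ul>
      <li><a href="index.html">Home</a></li>
      <li><a href="a.html">Page A</a></li>
      <li><a href="b.html">Page B</a></li>
      <li><a href="c.html">Page C</a></li>
    </ul>

Then link to sitemap.html from the footer of every page, so both visitors and crawlers can reach it from anywhere on the site.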