Ok guys, my site carcommons.com has about 39K pages indexed, give or take. I've been stuck at that number for a while now. I'm pointing a large amount of co-op weight at it and at some of its sub-pages, I've done about 100 directory submissions, and I have a few link exchanges. My goal is 100K pages indexed, because ultimately, every time I get more posts indexed, traffic rises. This should be very possible since I have 60,000+ posts, 14K+ members, and a book store to go along with it. What would be your tips for helping me reach my goal of 100K pages indexed?
Seriously mitz, the sitemap is the win. I have a site that had 58K indexed pages. I've been submitting daily sitemaps to Google for about a month now and have been growing steadily. Up to almost 200K now; it's really nice.
Yes sirs, http://carcommons.com/sitemap.php It's quite large, though, and takes a while to load. Should I look into reducing its size? Any other tips? I've heard vBulletin forums are indexed much faster and better than phpBB. Would it be worth switching, or should I use vBulletin on my next project? (I hate license fees.)
I am in the process of moving a 180K-user forum from phpBB to vB... it's pretty easy with their importer. As long as your map is under 10MB you're fine =) Depending on URL length this can be tricky... I have to split mine up into 5 separate 10MB files.
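The splitting idea above can be sketched in a few lines. This is just an illustration of the approach, not anyone's actual script: walk the URL list and start a new sitemap file whenever the next entry would push the current file past the size cap (10MB at the time of this thread).

```python
# Sketch: split a URL list into several sitemap files, starting a new
# file whenever the next <url> entry would exceed the size cap.
MAX_BYTES = 10 * 1024 * 1024  # Google's per-file limit at the time

HEADER = ('<?xml version="1.0" encoding="UTF-8"?>\n'
          '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
FOOTER = '</urlset>\n'

def split_sitemaps(urls, max_bytes=MAX_BYTES):
    """Return lists of <url> entries, each list small enough for one file."""
    files, current = [], []
    size = len(HEADER) + len(FOOTER)  # fixed overhead per file
    for url in urls:
        entry = "  <url><loc>%s</loc></url>\n" % url
        # Flush the current file if this entry would push it over the cap.
        if current and size + len(entry) > max_bytes:
            files.append(current)
            current = []
            size = len(HEADER) + len(FOOTER)
        current.append(entry)
        size += len(entry)
    if current:
        files.append(current)
    return files
```

You would then write each list out as `sitemap1.xml`, `sitemap2.xml`, etc., and point Google at all of them (or at a sitemap index file listing them).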
To vB? I'll let you know when it's over. If I were you I would do it in a minute. I have tons of custom stuff with phpbb2/postnuke that is going to be very time-consuming to rewrite for vB. vB, however, is fricking so much nicer =P
Do you mean the license for vB? It's so cheap I don't really consider it, but I imagine it would carry over. Your current URLs would not work, and you would have to 301 them back to the primary until your new stuff was indexed... are you getting a lot of SE traffic now?
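For reference, the 301s from the old phpBB URLs could be done with mod_rewrite in .htaccess. This is only a sketch with illustrative URL patterns; in a real migration the topic IDs usually change during import, so you'd likely need a lookup table rather than a one-to-one rewrite like this:

```apache
RewriteEngine On
# Old phpBB topic URL: /viewtopic.php?t=123
# New vB thread URL:   /showthread.php?t=123  (assumes IDs carry over)
RewriteCond %{QUERY_STRING} ^t=([0-9]+)$
RewriteRule ^viewtopic\.php$ /showthread.php?t=%1 [R=301,L]
```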
Hi shoemoney, which tool are you using to generate the sitemap? There are many tools available, but I've seen them create errors when dealing with more than 20K pages on a website. Any ideas? Thanks
Virginia SEO was referring to Google SiteMaps, not an actual sitemap on your site; they are two totally different things. Google SiteMaps is a new beta technology offered by Google, where you can submit every page of your site for indexing. Also, regarding your page size: if that page is over 100K, chances are Google won't read it all.
I wrote my own, but if I had the site that mitz has I would probably use one of the log file analyzers. I took one look at most of the tools out there and knew I wouldn't be able to use a cookie-cutter solution. It was a lot easier for me to generate URLs from my database and format them to the Google XML spec.
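To illustrate, a minimal version of that database-to-XML approach could look like the sketch below. The `posts` table, the `post_id`/`last_edited` columns, and the URL pattern are all hypothetical stand-ins, not the actual schema or code from this thread:

```python
# Sketch: generate a Google-spec XML sitemap from database rows.
# Table and column names here are made up for illustration.
import sqlite3
from xml.sax.saxutils import escape

def write_sitemap(conn, base_url, out_path):
    """Write one <url> entry per post row to out_path."""
    rows = conn.execute("SELECT post_id, last_edited FROM posts")
    with open(out_path, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for post_id, last_edited in rows:
            loc = "%s/viewtopic.php?t=%d" % (base_url, post_id)
            f.write("  <url>\n")
            f.write("    <loc>%s</loc>\n" % escape(loc))
            f.write("    <lastmod>%s</lastmod>\n" % last_edited)
            f.write("  </url>\n")
        f.write("</urlset>\n")
```

Regenerating this nightly from the live database and resubmitting it keeps the sitemap current as new posts come in.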
I don't have a link at hand (I can't even remember the program name atm), but I've used command-line programs to generate text-file sitemaps, basically just a list of URLs, which you then run the Google scripts against to make the Google Sitemaps. It works with a 50K-page site; the site was brand new and G is indexing it a lot quicker than I've ever had a site indexed before. You still seem to need a sizeable number of IBLs to persuade G to take notice, but once it does it seems to just keep on going. I've added a few hundred other pages that aren't in the G sitemap and, funnily enough, not a single one of those newer pages has been indexed, so it might be a good idea to keep the G sitemap up to date if you're going to use one.