my site is about 3 weeks old and of course PR0. Google indexed it rather quickly, but at first only one page, then it added 4, and so on. I was very happy when Google indexed almost all the news from 03 Dec 2005 (my site is a news site and changes daily), so I have 155 pages in Google's index. But since that day Google hasn't indexed any of my news at all! Why is this happening? I have a Google sitemap and I don't know what to do to get indexed faster.
Just a couple of things that come to mind: update your index page, add more incoming links, check for broken links, etc. Patience is also key.
You can't expect everything within 3 weeks of coming online. Google always hits hard at first, then slows down, but it will eventually come back, and often.
And what is the reason for Googlebot crawling through my whole site?! For example, today it generated 145 MB of traffic with 1450 hits (as my traffic analyser shows). My index page changes on a daily basis (daily news), and Googlebot comes every 10-12 hours to visit my entire site. I have another "problem": I have 2 sitemaps for Google: one is generated by an .asp script and one is pure .xml. The reason for this is that my site changes every day. BUT the automated sitemap is created with links like this: www.editiaspeciala.com/default.asp?newsid=-1&categorieid=9 and the second one has links like this: http://www.editiaspeciala.com/news/...niestria--confrontation-of-two-clans-1766.asp The second one is not updated every day. Could this be the reason for my not being indexed? PS: sorry for my poor explanation
1. Google sitemap: of course you have submitted your sitemap to Google on your Google Sitemaps account page, and you have checked that the sitemap is accepted/validated. (I recently had a problem for 3 days with the Google Sitemaps bot rejecting my robots.txt, which was otherwise accepted by the normal Googlebots. I corrected my robots.txt to please the Google Sitemaps bot as well, and now all is back to normal.)
2. If you want Google to come daily and visit all changing pages daily, then of course you also want to update your sitemap daily and have that frequency properly declared in your sitemap with <changefreq>daily</changefreq> (see the sketch after this list).
3. Line 1372: »</font [ - you may want to close that tag properly with a ">" rather than a "[". A validator check of your entire page can assure that the whole page is bot friendly as well: http://validator.w3.org/ always helps to make sure all text can be processed properly by bots.
4. You may want to submit your RSS feeds directly to the Google News service and of course also to Yahoo News.
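Just as a rough sketch, a single sitemap entry for one of your pages could look something like this, assuming the Google Sitemaps protocol version that is current right now (check the namespace against the Google protocol page; the lastmod date and priority are placeholders, not values from your real sitemap):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <!-- a raw "&" in a query string must be escaped as &amp; or the XML will not validate -->
    <loc>http://www.editiaspeciala.com/default.asp?newsid=-1&amp;categorieid=9</loc>
    <lastmod>2005-12-03</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

The unescaped "&" in query-string URLs is one of the most common reasons an automatically generated sitemap fails validation, so that is worth checking in your .asp-generated one.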
1. For Yahoo RSS submission you may see my recent blog post on Yahoo's power in RSS and blog technology, or go directly to the Yahoo! Publisher's Guide to RSS. As mentioned for Google below, Yahoo also distinguishes between private RSS feeds and actual NEWS feeds; I have 3 RSS feeds and only one is in Yahoo NEWS.
2. For Google RSS and other webmaster help, see Google Information for Webmasters. For RSS submission to Google see http://www.google.com/feedfetcher.html and the Google NEWS pages; to submit your RSS NEWS feed to Google, proceed with "I want to recommend a news source". This Google RSS feed submission is NOT for private RSS feeds but for NEWS feeds, as it may be in your case of an entire site devoted to news from a particular country.
3. About RSS submission in general: maybe my RSS howto can help you promote your feeds, or you may find more resources and help by searching Google for "RSS howto".
4. For both of the above-mentioned RSS NEWS submissions it may take up to a week or two for your feed submission to be processed. Once accepted, the Yahoo feedseeker may be on your site's feeds several times per hour, or within minutes upon request by ping.
Good luck!
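PS: if you do not have a proper news feed yet, here is a minimal RSS 2.0 skeleton, just a sketch - the channel and item values are placeholders, not taken from your actual site:

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Editia Speciala - Daily News</title>
    <link>http://www.editiaspeciala.com/</link>
    <description>Daily news headlines</description>
    <item>
      <title>Example headline</title>
      <link>http://www.editiaspeciala.com/news/example-article.asp</link>
      <pubDate>Sat, 03 Dec 2005 12:00:00 GMT</pubDate>
      <description>One or two sentence summary of the article.</description>
    </item>
  </channel>
</rss>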
thank you guys for your support! I'll try to validate my site; the thing is that I don't know what to do with Flash sources - this is mainly what does not validate. Is there a way to validate Flash?
I've done that! I have 2 sitemaps for my site, but Google cannot verify my sitemaps because of an internal error. I don't know why!?
YOU have to validate your sitemaps, of course, and correct them to zero errors, or else Google rejects the sitemap!

RSS feeds and Google sitemaps are FAR more delicate to get read by the bots than regular content. While regular content can SOMETIMES handle a bunch of errors, depending on the precise kind of code errors, sitemaps allow for zero errors or Google will reject them. The same applies to RSS feeds.

Linux users would use the tool xmllint with the following syntax for offline validation:
xmllint --schema /path_to_your_offline_file/DTD/sitemap.xsd /path_to_your_offline_google_sitemap/sitemap.xml
xmllint does NOT validate compressed sitemaps, only the uncompressed version. If you are a Windows user, you may have to Google- or Y-search for an online validator specially for Google sitemaps, or use a tool that always creates clean, validated sitemaps. You may try some of the Google recommended tools: https://www.google.com/webmasters/sitemaps/docs/en/protocol.html

You also have to have your sitemap verified - verified is different from validated!! https://www.google.com/webmasters/sitemaps/docs/en/verify.html
Google about validating sitemaps: https://www.google.com/webmasters/sitemaps/docs/en/protocol.html

TIP of the day: if you make a text format sitemap for Google, then you can also submit THAT text format to Yahoo, as Y only accepts text format sitemaps (one URL per line, example below) and NO Google xml format sitemaps!! If you use the xml sitemap format for G, then you may have to create a second sitemap just for Y; see Google sitemap submission to Yahoo. There you may find various online validators; I never found a good sitemap validator so far, but had no need to search since xmllint works perfectly for me.
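For the Yahoo text format mentioned in the tip, it really is just one full URL per line and nothing else, for example (the article URL here is only illustrative, use your real ones):

http://www.editiaspeciala.com/
http://www.editiaspeciala.com/default.asp?newsid=-1&categorieid=9
http://www.editiaspeciala.com/news/example-article-1766.asp

Note that in the plain text format the "&" does not need to be escaped; that is only required in the xml sitemap.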