Is Google cleaning their index?

Discussion in 'Google' started by LaCabra, Apr 12, 2006.

  1. Roman

    Roman Buffalo Tamer™

    Messages:
    6,217
    Likes Received:
    592
    Best Answers:
    0
    Trophy Points:
    310
    #161
    I'm blessed: the new site I launched 3 weeks ago has the home page indexed, nothing more, but nothing less, and as Edgar Allan Poe said, "Nevermore" :)
     
    Roman, May 5, 2006 IP
  2. dudy255

    dudy255 Peon

    Messages:
    689
    Likes Received:
    105
    Best Answers:
    0
    Trophy Points:
    0
    #162
    Yeah, this is happening to me too. One of my sites, which had around 50 pages indexed, now has only 1 page indexed, but my new site got its first page indexed.
     
    dudy255, May 6, 2006 IP
  3. wisam74us

    wisam74us Well-Known Member

    Messages:
    1,059
    Likes Received:
    47
    Best Answers:
    0
    Trophy Points:
    168
    #163
    Many of my sites were affected in a horrible way, like dropping from 50k to 10, but I need to ask something about the difference between:
    site:www.domain.com (I get 50k)
    and
    site:domain.com (I get 10)

    Shouldn't they return the same number of pages?
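    As a side note on those two query forms: in principle, site:domain.com should return at least as many pages as site:www.domain.com, since the bare domain covers the www host plus every other subdomain, so a smaller count for the bare domain suggests the numbers are just inconsistent estimates. A minimal sketch of the scoping difference, using made-up example URLs and a hypothetical matches_site helper (it mimics the operator's scoping, it's not anything Google exposes):

    Code:
    from urllib.parse import urlparse

    def matches_site(url, site):
        # Mimic site: scoping: an exact host match, or any subdomain
        # of the given site.
        host = urlparse(url).hostname or ""
        return host == site or host.endswith("." + site)

    urls = [
        "http://www.example.com/page1.html",
        "http://example.com/page2.html",
        "http://forum.example.com/thread3.html",
    ]
    print(sum(matches_site(u, "www.example.com") for u in urls))  # 1
    print(sum(matches_site(u, "example.com") for u in urls))      # 3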
     
    wisam74us, May 6, 2006 IP
  4. wisam74us

    wisam74us Well-Known Member

    Messages:
    1,059
    Likes Received:
    47
    Best Answers:
    0
    Trophy Points:
    168
    #164
    wisam74us, May 6, 2006 IP
  5. wisam74us

    wisam74us Well-Known Member

    Messages:
    1,059
    Likes Received:
    47
    Best Answers:
    0
    Trophy Points:
    168
    #165
    I get a lot of headaches monitoring Google's datacenters and seeing different results, but I can't help it. Finally I found something really funny: when trying to track the indexed pages of a very new website (1 month old), I saw two results for the homepage. When I checked the cached pages, the second one was very strange and belonged to a different website (a parked domain):
    site:artificialdiamonds.net
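    For anyone else watching datacenters, a minimal sketch of the idea: send the same site: query to a few datacenter IPs with a www.google.com Host header and compare the reported counts. The IPs below are hypothetical placeholders, the regex for the result count is fragile because Google's markup changes often, and scraping results like this is against their terms of service, so treat it as an illustration only:

    Code:
    import re
    import urllib.parse
    import urllib.request

    DATACENTERS = ["64.233.161.104", "66.102.7.104"]  # hypothetical example IPs
    QUERY = "site:artificialdiamonds.net"

    for ip in DATACENTERS:
        # Hit the datacenter directly, but tell it we want www.google.com.
        url = "http://%s/search?q=%s" % (ip, urllib.parse.quote(QUERY))
        req = urllib.request.Request(url, headers={"Host": "www.google.com",
                                                   "User-Agent": "Mozilla/5.0"})
        try:
            html = urllib.request.urlopen(req, timeout=10).read().decode("latin-1", "replace")
            match = re.search(r"of about <b>([\d,]+)</b>", html)
            print(ip, match.group(1) if match else "count not found")
        except OSError as err:
            print(ip, "error:", err)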
     
    wisam74us, May 6, 2006 IP
  6. Spendlessly

    Spendlessly Peon

    Messages:
    129
    Likes Received:
    18
    Best Answers:
    0
    Trophy Points:
    0
    #166
    The sites in my network that were hurt the most were the ones that hadn't been indexed for longer than 2-3 months. Many were reduced to home page only, some to 3-4 pages on a site:domain.com search with the "show omitted results" indicator. They weren't supplemental, nor too similar to other pages of the site. I've learned to just roll with the punches on these things and keep building. You can get stuck DC watching for months if you aren't careful. Just focus on the big prize; nose to the grindstone, folks.
     
    Spendlessly, May 7, 2006 IP
  7. smokey99

    smokey99 Well-Known Member

    Messages:
    475
    Likes Received:
    11
    Best Answers:
    0
    Trophy Points:
    108
    #167
    I have 10 small sites, ranging from 10 pages to 50 pages.
    I used the same process in SEOing them all, except one (site M), in which I included more KWs in the URL (a little spammy).

    All pages in all sites, up to the last couple of updates, have or had 20-30 KWs in top SERPs in G.

    In the last 6 months or so, all my sites have taken hits and seem to be bouncing; some of my less supported KWs are #3 one day, not in the top 10,000 the next, and a week later back on top of the SERPs.

    Except "site M", which has been hit hardest, especially those pages that have spammy URLs, which have been dropped from cache, index, and SERPs,
    and now obviously get 0 referrals from G.

    When I do a site: on site M, most of the spammy URL pages are gone.
    BUT there is a page displayed that hasn't existed on the server in more than 2 years!

    If G were running out of space, they would clean out old pages that no longer exist before pages that are current.
    I have heard G is known to archive every piece of data it has ever collected; again, I would think they would start dumping old data before current data.
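    If you want to see how much of what site: shows is stale, a quick sketch of the audit: paste the URLs from the site: results into a list and HEAD-request each one to find the ones that no longer exist on the server. The URL below is a made-up placeholder:

    Code:
    import urllib.error
    import urllib.request

    def status_of(url):
        # HEAD request: we only need the status code, not the page body.
        req = urllib.request.Request(url, method="HEAD")
        try:
            return urllib.request.urlopen(req, timeout=10).status
        except urllib.error.HTTPError as err:
            return err.code

    indexed_urls = ["http://www.example.com/old-spammy-page.html"]
    for url in indexed_urls:
        code = status_of(url)
        print("STALE IN INDEX:" if code == 404 else code, url)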

    In trying to find some common ground:
    Is it possible that G is looking harder at over-optimization and KW spam?
    Do your sites that are hardest hit use a lot of KWs in the URL?
    Or are they maybe over-optimized?
     
    smokey99, May 7, 2006 IP
  8. dudester

    dudester Peon

    Messages:
    4
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #168
    Here's a good one for all of us. I thought: what better way to find out how authoritative the world's self-proclaimed best indexer of information is? Here's how: check how many pages of another authority it contains in its own index. You know, like a genome containing many strings of DNA that were verified and built by other authorities.

    Here we go: the LIBRARY OF CONGRESS.

    site:http://loc.gov

    Enjoy your 500-1000 results (including supplementals) with a ****load (oh, only about 34 million or so) of unavailable but previously indexed pages. Loving it.

    I want to see a syndicated article on this from any decent press agency that isn't too lazy to write off webmasters complaining about Google's utter inability to search basic data as sour grapes. Of course, if you do site:http://loc.gov +"whatever popular word" it will still max you out at 1000 results, and so will Yahoo. I guess it's also about how data are to be presented.

    My own issue is that I have just 5 of 600 pages (down from about 65 before BD) of a decent, non-spammy site listed with Google, with no supplementals and no word from anyone on what to do to get things going upwards. I wish people searched on other engines more.

    I wish Google competed with hundreds of decent search engines for advert money, because I think a quarter for 100 clickthroughs on an ad is kinda affordable for me, not 50 bucks. Otherwise this whole ecommerce thing is going to be limited to about, well, 1000 sites that thrive.

    Sorry for the AdSense people who will be losing their incomes as soon as other search engines and alternative traffic drivers replace AdWords as the only means of getting exposure. Good riddance, search engine traffic from Google. Oh yeah, and put your money on their stock and other SEs' stocks plummeting; it's just a matter of time.

    And it's a matter of time before there's an open source search engine integrating tons of good algorithms, residing on a network of PC users' machines and available to be syndicated on any site. I guess that's where traffic will be coming from in the future. I wish P2P networks already indexed data the way search engines do (why can't they? Hello, anyone? LimeWire? Soulseek? Work on those interfaces now!!!). I know I live in a fantasy world... hehe. Well, everyone making money off the web right now kinda does.
     
    dudester, May 11, 2006 IP
  9. minstrel

    minstrel Illustrious Member

    Messages:
    15,082
    Likes Received:
    1,243
    Best Answers:
    0
    Trophy Points:
    480
    #169
    Will it be as good as DMOZ? :eek:

    I can hardly wait... :rolleyes:
     
    minstrel, May 11, 2006 IP
  10. dudester

    dudester Peon

    Messages:
    4
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #170
    You misunderstood. I think that both current SEs and DMOZ clones are bottlenecks, albeit end users can still find stuff in both.

    But eliminating the bottlenecks, I think, will require stepping away from the concept of capacities. What I meant by open source is that a little bit of anarchy would help to sort out the whole "we don't want crap in our index" thing. I mean, there is tons of crap on peer-to-peer networks and most people are still able to locate what they need, lol. Otherwise, why would they be so popular? Nobody peer reviews those things, and yet they are efficient at disseminating information. I don't know.

    My point was that search engine decentralization would help the end user and businesses alike by driving the cost of doing business lower, driving prices lower, and offering more choices. It wouldn't help people relying on AdSense for income, since the earnings would be proportionately lower as a result of competition between SEs for advertisers' money. Although, one would probably need hundreds of such SEs before the costs would go down in any significant way. I guess it's ultimately about the SEs' reach.

    The difficulty with trying to index everything currently is that there is not enough capacity even for giant companies, partly because of spammers (if you ask me, they are children of Google AdSense anyway, so eliminate AdSense as a viable source of income, or hire a bunch of people to track the sites that display AdSense and ban owners for life for wrongdoing (since Google cuts them checks anyway!!!), and there will be no spam. Gasp, I will get killed for this).

    So that's why my not-so-original idea of finding other direct avenues to connect the supplier of information with the end user. We see wikis, search comparison engines, etc., but these are all baby steps IMHO.

    I digress. I just hate playing by the arbitrary rules that Google et al. tend to impose on webmasters/stores/everyone, with no recourse for those that don't want to play nicely and punishments for those that do...
     
    dudester, May 11, 2006 IP
    GTech likes this.
  11. minstrel

    minstrel Illustrious Member

    Messages:
    15,082
    Likes Received:
    1,243
    Best Answers:
    0
    Trophy Points:
    480
    #171
    I think you misunderstand how Google is structured. Google is quite decentralized already: it's a whole network of datacenters accessed according to server load and geography.

    I don't share your optimism about an open source search engine, no matter how it's organized. If you think spamming and black hat SEO are a problem in existing search engines, multiply that exponentially and you might get some idea of how much crap you'd have to wade through to find anything at all.

    As for P2P networks as a model, they are already bad enough (the ratio of crap to anything useful there is huge), but try to imagine what that would be like if P2P were a commercial network selling products and services, instead of swapping pirated images, music, movies, and software. That crap ratio would expand beyond your imagination.
     
    minstrel, May 11, 2006 IP
  12. wisam74us

    wisam74us Well-Known Member

    Messages:
    1,059
    Likes Received:
    47
    Best Answers:
    0
    Trophy Points:
    168
    #172
    I read today in the WMW forum some feedback for a member there from GG (one of Google's staff), asking him to fill out a reinclusion form.
    I just want to make sure I understand what filling out a reinclusion form means... doesn't that mean your site is banned from Google?
    As for me, on some of my websites I see a variety of symptoms:
    - Dropping pages, and for two websites just the homepage is left (pages with PR3 dropped)
    - Supplemental pages were dropped
    - Nothing changed in my SERPs
    - Still in the same positions for my main keywords
    - Google is still visiting my sites, and the cached copy has an updated date, but no pages are being added to its index

    Are those sites banned, or penalized, or affected by the BigDaddy update? And should I fill out a reinclusion request for those sites?
     
    wisam74us, May 12, 2006 IP
  13. Seiya

    Seiya Peon

    Messages:
    4,666
    Likes Received:
    404
    Best Answers:
    0
    Trophy Points:
    0
    #173
    Hmm, well, I have lost about 5 page-1 rankings =P
     
    Seiya, May 12, 2006 IP
  14. Bryan

    Bryan Active Member

    Messages:
    306
    Likes Received:
    15
    Best Answers:
    0
    Trophy Points:
    58
    #174
    Weird... I just had a site that was deindexed get indexed again for a few days with some good SERPs, only to be deindexed again a few days later :(
     
    Bryan, May 12, 2006 IP
  15. BrianR2

    BrianR2 Guest

    Messages:
    734
    Likes Received:
    24
    Best Answers:
    0
    Trophy Points:
    0
    #175
    I noticed the same thing in Google Sitemaps. It says pages are blocked by robots.txt, but the test tool shows that they are allowed.
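    One way to double-check what robots.txt actually allows, independent of the Sitemaps tool, is Python's standard robotparser; the domain below is a placeholder. If this says a path is allowed while the report claims it's blocked, the discrepancy is on Google's side (or a stale cached copy of robots.txt):

    Code:
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()  # fetches and parses the live robots.txt
    print(rp.can_fetch("Googlebot", "http://www.example.com/somepage.html"))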
     
    BrianR2, May 12, 2006 IP
  16. guybrush

    guybrush Peon

    Messages:
    88
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #176
    Same here. I tried to contact the G Sitemaps staff to point this out but haven't received any reply yet.
     
    guybrush, May 12, 2006 IP
  17. seopup

    seopup Peon

    Messages:
    274
    Likes Received:
    10
    Best Answers:
    0
    Trophy Points:
    0
    #177
    For me:

    One website dropped from 80k indexed pages to 43.

    Another from 20k to 700+.

    These days we should take a vacation and not look at the SERPs or the index :)
     
    seopup, May 12, 2006 IP
  18. wisam74us

    wisam74us Well-Known Member

    Messages:
    1,059
    Likes Received:
    47
    Best Answers:
    0
    Trophy Points:
    168
    #178
    For dropping pages it's OK, as we now know that supplemental pages will not be included in site: results,
    so we will see smaller numbers than we were used to seeing in the past. But the problem that I count as really serious is this:
    Google keeps dropping non-supplemental pages with good PR 2-3 (I can't see them even if I search for a specific word in them within the site), and Google hasn't indexed any new pages in the last month.

    Maybe it is time for a vacation, as seopup said.
     
    wisam74us, May 12, 2006 IP
  19. digitalcamera

    digitalcamera Banned

    Messages:
    112
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #179
    They went away and now they're back, so they reverted whatever crap they did last. Or only did a bit of whatever.
     
    digitalcamera, May 12, 2006 IP
  20. seopup

    seopup Peon

    Messages:
    274
    Likes Received:
    10
    Best Answers:
    0
    Trophy Points:
    0
    #180
    Somewhere I saw that Google said they have a "machine crisis"; maybe that's why so many pages are gone?
     
    seopup, May 12, 2006 IP