What can be the matter?

Discussion in 'Websites' started by Owlcroft, Dec 4, 2004.

  1. dazzlindonna

    dazzlindonna Peon

    #21
    So Owlcroft, since one of your SEO tools uses a similar concept (the books thing), do you see the same kind of penalty being applied to it as well?
     
    dazzlindonna, Dec 6, 2004 IP
  2. SEbasic

    SEbasic Peon

    #22
    I don't think it would.

    With enough content from enough different places, you can hide pretty much anything.
     
    SEbasic, Dec 6, 2004 IP
  3. Owlcroft

    Owlcroft Peon

    #23
    I haven't even looked, being so busy trying to keep the encyclopedia running freely, but I have had one user report to me that he is seeing a material decline in pages indexed.

    The answer there, too, is to meet idiocy with idiocy by artificially padding each page with unneeded garbage, to keep the teeny tiny "minds" at Google happy. ("Unneeded garbage" doesn't mean gibberish, or totally irrelevant material--it just means material that is related, but by no reasonable standard does it _need_ to be on the page.)

    This is yet another in the seemingly endless parade of contra-logical inanities from the Fine Folks who brought you "links are votes" as an idea. But, as Ernestine always said, "We don't care--we don't have to."
     
    Owlcroft, Dec 6, 2004 IP
  4. Owlcroft

    Owlcroft Peon

    #24
    I sent a reply email to Google's form reply; in it, I pasted their entire "Guidelines" text, and addressed each one individually, saying things like EXACTLY DONE or, here and there, adding a note (such as on "duplicate content").

    I have now received a reply email stating:

    We understand your concern and have passed your message on to our engineering team for further investigation.
    Whether that is meaningful or just a variant of the old "Send this nut the bugs letter" joke, I cannot say, but at least it's different content . . . .
     
    Owlcroft, Dec 7, 2004 IP
  5. exam

    exam Peon

    #25
    I finally decided to throw in my $0.02.

    On the pages I looked at on your site, the title says "Search for: -". If you fix that to display the word that was searched for, that's one more thing you can do to make each page different.
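    For what it's worth, the fix could be as small as dropping the search term into the title, with escaping so odd input can't break the markup. A minimal sketch (in Python rather than the site's PHP, and with made-up names):

```python
import html

def page_title(query: str) -> str:
    """Build a per-page title from the user's search term.

    Falls back to a generic title when the term is empty, so no page
    is left with a bare "Search for: -" heading.
    """
    query = query.strip()
    if not query:
        return "Search the encyclopedia"
    # Escape the term so user input cannot break out of the <title> tag.
    return "Search for: " + html.escape(query)
```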

    On to my main idea of how to make the site more useful to users (this, I think, is the single most important aspect of SEO)...

    1. Figure out some way of creating an algorithm that will "rate" the relevance of both the Wikipedia articles and the ODP links. Maybe you can use your 1,000 Google API queries per day, with the first 1,000 user searches on your site, to get the PR of the actual Wikipedia article pages, so that you can order the articles by relevance or importance. Then you can display a 300-word excerpt of the first two articles, followed by a list of the next five most relevant articles, and, if you want, follow that with a link to "more". In the same fashion, the DMOZ links can be ranked using the Google API--maybe with PageRank (can you get PR from the API?), or maybe by comparing where the DMOZ links show up in the SERPs (relative position)--and only display the first 10 links along with a page excerpt. (You can have links to more DMOZ entries if you want.)

    I think some changes like this would help make the page more useful to the user (a search would be more effective and quicker while still allowing access to all the results if the user wants it), thus making the site better for the robots.
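    The ranking idea above could be sketched roughly like this--hypothetical names throughout, with the quota-limited PR lookup (whatever its real shape in the Google API) hidden behind a callable and a cache so it is only consulted once per URL:

```python
from typing import Callable, Dict, List

def rank_links(urls: List[str],
               lookup: Callable[[str], int],
               cache: Dict[str, int],
               top_n: int = 10) -> List[str]:
    """Order links by a relevance score (e.g. PageRank), best first.

    `lookup` is consulted only for URLs not already in `cache`, so a
    daily query quota (like the API's 1,000 calls) is not burned twice
    on the same page.
    """
    for url in urls:
        if url not in cache:
            cache[url] = lookup(url)
    return sorted(urls, key=lambda u: cache[u], reverse=True)[:top_n]
```

    The same skeleton would serve for ordering the DMOZ links by SERP position: swap in a lookup that returns a negated rank, so "position 1" scores highest.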

    Just some rambling on my part...... :)
     
    exam, Dec 7, 2004 IP
  6. Owlcroft

    Owlcroft Peon

    #26
    Interesting ideas. I don't know if I have enough data per article/topic to be usable, but I'll look.

    Meanwhile, I have done two things. First, on the bulk of the directory-index pages--the 10,000 "low-level" ones (the ones that list articles directly)--I have tagged on a randomly selected article from the list, so that there is substantial extra, always-different content from one to another of those 10,000 pages (and even different on each viewing of any one). Second, I sent a further email to Google, with a copy of my .htaccess file attached, as a full-disclosure effort to show 100% good faith.
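    The random-article padding could be as small as this (a sketch with invented names, not the site's actual code):

```python
import random

def pad_index_page(articles, excerpt_for):
    """Return an HTML fragment featuring one randomly chosen article,
    so each low-level index page (and each viewing of any one page)
    carries different extra content.

    `articles` is the list already shown on the page; `excerpt_for`
    maps an article to a short text excerpt.
    """
    pick = random.choice(articles)
    return ("<h3>Featured article: %s</h3>\n<p>%s</p>"
            % (pick, excerpt_for(pick)))
```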

    Well, we will see what we see.

    Oh, yes: I also checked all the links I have been giving people, and dropped the few that were PR-zero pages, which is probably unfair to those folks (whom I will soon notify individually by email), because they're probably all good scouts with new pages/sites--but I wanted to eliminate all chances of being seen as linking to any of those dreaded "bad neighborhoods". (If anyone knows a better identification of "bad-neighborhood" pages than PR zero, please speak up!)
     
    Owlcroft, Dec 7, 2004 IP
  7. T0PS3O

    T0PS3O Feel Good PLC

    #27
    If they're in the index but PR0, they can be perfectly legit. If the bar is grey with PR0 and the page is not in the index--that's when you should start worrying about this.
     
    T0PS3O, Dec 8, 2004 IP
  8. Owlcroft

    Owlcroft Peon

    #28
    One of the few drawbacks to not running M$ products is not having tools written expressly, and only, for them. I have Firefox 1.0 (under OS/2) with the so-called Google Pagerank Status extension installed, which seems to show PRs reliably, but I wouldn't go so far as to say it shows the well-known grey/white distinction on PR0 pages (it *may*, but I don't know).

    But yes, they may very assuredly be 100% up and up, and I am bitterly aware that I may be acting unfairly to the few pages at issue.
     
    Owlcroft, Dec 8, 2004 IP
  9. a389951l

    a389951l Must Create More Content

    #29
    Hey Eric, don't worry. I noticed that you dropped one of my sites that is PR0--my interior pages have PR, though, which is strange. No hard feelings--of course, I had to drop your link from that site.
     
    a389951l, Dec 8, 2004 IP
  10. dazzlindonna

    dazzlindonna Peon

    #30
    I brought up the books tool because I have it on one site and all the pages are starting to disappear into the URL-only type. I just noticed they return header status 302. Are they supposed to? Something seems wrong about that--but I'm no expert on header statuses. So... the question is: are the pages that are dropping for you also returning 302s, and if so, could this be the problem?
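    One way to see what status a page actually returns is to fetch it without following redirects, so a 302 shows up as 302 instead of the redirect target's 200. A Python sketch (disabling urllib's redirect handling by returning None from the redirect hook):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow 3xx responses so their code is surfaced."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def status_of(url: str) -> int:
    """Return the HTTP status code of `url` without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url).status
    except urllib.error.HTTPError as err:
        # Unfollowed redirects (and other non-2xx codes) land here.
        return err.code
```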
     
    dazzlindonna, Dec 8, 2004 IP
  11. Owlcroft

    Owlcroft Peon

    #31
    dazzlindonna said:
    I brought up the books tool because I have it on one site and all the pages are starting to disappear into the url only type. Just noticed they return header status 302. Are they supposed to? Something seems wrong about that - but I'm no expert on header statuses. So...question is...are the pages that are dropping for you also returning 302's, and if so, could this be the problem?

    The 302 is a result of the mod_rewrite of a dynamic URL to a static URL. I cannot imagine that that is a problem for Google, but I may have a limited imagination: they have so very many ways of being antic.

    I need to put this in a separate post, but if anyone using the Freebie books-list package is seeing pages disappear from the count, here is a quick, simple bandaid fix:

    change the line in your .htaccess that currently points to the script free1.php to refer instead to free2.php

    That will cause the secondary script, the one with as many reviews ("Editorial" and "Reader") as Amazon has, to appear first, instead of being a clickable follow-on to the original reviewless page. Its drawbacks are two: one, not every book has reviews (so this is not a foolproof way of distinguishing every page), and two, for popular books--with, correspondingly, many associated reviews--it can greatly increase the page-load time (but all the important material is atop the page, so that's not too bad either).
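    For anyone unsure which line is meant: hypothetically, if the package's rewrite rule looks something like the commented line below--the real pattern shipped with the books-list package may differ--the change is just the script name:

```apache
RewriteEngine On

# Before (original, review-less page served first):
# RewriteRule ^books/([^/]+)\.html$ free1.php?book=$1 [L]

# After (reviews page served first):
RewriteRule ^books/([^/]+)\.html$ free2.php?book=$1 [L]

# Unrelated to the swap, but relevant to the 302 question above: when a
# rule issues an external redirect, mod_rewrite's default status is 302;
# adding R=301 makes it a permanent redirect instead, e.g.
# RewriteRule ^old\.php$ /new.html [R=301,L]
```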

    a389951l: I understand perfectly. I doubt me much that any of the five or six links I dropped is anything but 100% clean, and I hate to harm both the link target and myself, but that is what Googlery brings us to. Why these pusillanimous putzes think it's OK to penalize sites yet never reveal why or for what they do so is very unclear, unless it's that they know their criteria are so bad that they fear legal action if they ever actually disclose their nominal grounds for action. That "we need to keep our algorithms secret" flummery can only be stretched so far in trying to cover sheer ineptitude.
     
    Owlcroft, Dec 8, 2004 IP