Google busting folks hiding via CSS

Discussion in 'Google' started by mdvaldosta, Mar 14, 2006.

  1. #1
    Nice, seems like G is reading CSS files. I remember a while back there was debate as to whether or not you could get away with hidden text by using CSS.

    Hehe, nice to see you can't :D Boing! If you've got it, you'd better clean it up :D
     
    mdvaldosta, Mar 14, 2006 IP
    Greg-J likes this.
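[Editor's note: a minimal sketch, in Python with hypothetical names, of the kind of hidden-text check being discussed here. It only inspects inline style attributes for the classic tricks (display:none, visibility:hidden, text coloured the same as a declared background); real detection, whatever Google does, would be far more involved.]

```python
import re

def looks_hidden(style: str) -> bool:
    """Heuristic check of an inline style attribute for classic
    hidden-text tricks: display:none, visibility:hidden, or a text
    colour identical to the declared background colour."""
    s = style.replace(" ", "").lower()
    if "display:none" in s or "visibility:hidden" in s:
        return True
    # Compare 'color' (not preceded by '-', to skip 'background-color')
    # against any declared background colour.
    color = re.search(r"(?<![a-z-])color:(#[0-9a-f]{3,6})", s)
    background = re.search(r"background(?:-color)?:(#[0-9a-f]{3,6})", s)
    return bool(color and background and color.group(1) == background.group(1))

print(looks_hidden("color: #fff; background-color: #fff"))  # True
print(looks_hidden("color: #000; background: #fff"))        # False
```

Note the obvious gap: this sees nothing that lives in an external stylesheet, which is exactly what the next posts argue about.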
  2. JeremyCade

    JeremyCade Peon

    #2
    Aren't you forgetting a little thing called robots.txt?

    It's common practice for untoward webmasters to "disallow" Googlebot from viewing their CSS files.

    Regardless, if people didn't try to hide things this wouldn't be an issue.
     
    JeremyCade, Mar 14, 2006 IP
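[Editor's note: the "disallow the CSS" trick above can be illustrated with Python's standard-library robots.txt parser. The robots.txt content and URLs are made up for the example; the point is that a rule-respecting crawler never fetches the stylesheet that does the hiding.]

```python
from urllib.robotparser import RobotFileParser

# A robots.txt of the kind described: the stylesheet directory is
# blocked for every crawler, so an obedient bot never sees the CSS.
robots_txt = """\
User-agent: *
Disallow: /css/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "http://example.com/css/style.css"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))     # True
```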
  3. AtoZNetVentures

    AtoZNetVentures Peon

    #3
    How can Google automatically confirm that CSS is actually being used to hide text? Surely a human would need to confirm. For example, if I wanted to hide text I might use CSS to make my text the same colour as the background, which Google could maybe check automatically. But what if I use CSS to make my text the same colour as a background image? Google obviously can't confirm the colour of an image from the code, since all it sees is the path to that image. That would mean a human must check; otherwise Google would risk penalizing sites that aren't hiding their links this way. Google can't differentiate, right?
     
    AtoZNetVentures, Mar 14, 2006 IP
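[Editor's note: the background-image case isn't undecidable in principle, it just requires fetching and decoding the image. A toy sketch, assuming the image has already been decoded into RGB pixels (stdlib only, made-up data): find the image's dominant colour and flag text whose colour is close to it.]

```python
from collections import Counter

def dominant_color(pixels):
    """Most frequent RGB tuple in a decoded background image."""
    return Counter(pixels).most_common(1)[0][0]

# Toy 'image': mostly white, with a few grey pixels.
background = [(255, 255, 255)] * 20 + [(200, 200, 200)] * 3
text_color = (255, 255, 255)

# Manhattan distance between the text colour and the dominant
# background colour; small distance means the text would blend in.
dom = dominant_color(background)
distance = sum(abs(a - b) for a, b in zip(dom, text_color))
print(distance <= 30)  # True: the text would be invisible on this image
```

Whether a 2006-era crawler actually spent resources doing this is another question, which is the poster's real point.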
  4. mcfox

    mcfox Wind Maker

    #4
    I don't think removal is automated; more likely it's human-reviewed and then removed. Probably discovered when one site in a chain of linked sites was found to have CSS-hidden text.
     
    mcfox, Mar 14, 2006 IP
    Greg-J likes this.
  5. AtoZNetVentures

    AtoZNetVentures Peon

    #5
    I agree with mcfox, but would there be some sort of 'red flag' that triggers a human review, or would the site need to be reported via their spam reporting email?

    Maybe Google looks out for what appears to be overuse of keywords, i.e. keyword densities that are too high, or obvious lists of keywords (lots of the same keyword repeated in close proximity that someone might have used to cover all variations of their phrase).

    Any other ideas?
     
    AtoZNetVentures, Mar 15, 2006 IP
  6. INV

    INV Peon

    #6
    Matt is just doing his job and always likes to add a little vouch for the engine, as he should. However, he didn't exactly say that Googlebot detects these things.

    Thinking about it logically, Googlebot isn't advanced enough to tackle CSS at the moment. It's far too easy to write/hide the CSS in a way the bot will have trouble figuring out, and letting the bot handle such things would cause too many problems to even consider. There is of course a way to battle this, but the resources required would be way too high to tackle this problem at the present time, given the size of the index.

    The way they probably found the website would be either via a report or, as mcfox said, via a trail. However, you have to consider that Matt Cutts's job, from what I know, is analyzing Google's algo, which would mean he found the website ranking for keywords it shouldn't, saw the reason, and flagged it himself or had another member of the team do so.
     
    INV, Mar 15, 2006 IP
  7. minstrel

    minstrel Illustrious Member

    #7
    Actually, in this case the hidden text appeared when CSS was "turned off" - they were using CSS to obscure the text, but it wasn't obscured to Googlebot (or to Matt Cutts).

     
    minstrel, Mar 15, 2006 IP
  8. INV

    INV Peon

    #8
    The CSS would usually be put in a separate file, with a class/div/tag given the value display: none or hidden, so turning the CSS off shows the text to the end user since the cover is off. However, since Googlebot doesn't process that CSS display rule, it reads the hidden text as normal content.
     
    INV, Mar 15, 2006 IP
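[Editor's note: the point above - a text-only reader sees "hidden" text because the hiding lives in an external stylesheet it never applies - can be demonstrated with a crude extractor built on Python's stdlib HTML parser. The page markup and class name are invented for the example.]

```python
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Crude text extractor: collects character data and ignores all
    styling, much like a simple crawler reading a page as text."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

# The rule ".hidden { display: none; }" lives in a separate CSS file,
# so nothing in the markup itself says this div is invisible.
page = '<p>Visible copy.</p><div class="hidden">stuffed keywords</div>'

parser = TextOnly()
parser.feed(page)
print(parser.chunks)  # ['Visible copy.', 'stuffed keywords']
```

The extractor happily returns the stuffed keywords alongside the visible copy, which is exactly why the trick "worked" on a naive bot.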
  9. minstrel

    minstrel Illustrious Member

    #9
    That's what I thought. So this isn't a case of Google reading CSS files or CSS files disallowed by robots.txt - this is just a case of Googlebot being a text reader, as usual.

    This is in response to the thread starter:

    and the following post

     
    minstrel, Mar 15, 2006 IP
    INV likes this.
  10. mad4

    mad4 Peon

    #10
    It's quite easy for Google to spot pages with loads of keywords stuffed in the footer. Then, if they manually visit the page and the text is not visible, they will mark it as hidden text.

    Cloaking and using robots.txt to block Google won't work, as Googlebot can easily visit the site and call itself MSIE or Firefox.

    Also, think of the sites that are able to take screenshots of websites (Alexa, for example), and then think how easy it is to use character recognition on images. There is nothing to stop Google taking screenshots and using character recognition to analyze the text being displayed.

    Even if this doesn't happen yet, you can bet it soon will.
     
    mad4, Mar 15, 2006 IP
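[Editor's note: the cloaking check described - visit once as a declared crawler, once as a browser, and compare - reduces to a simple diff. A self-contained sketch with a stand-in `serve` function (invented for the example) in place of real HTTP fetches:]

```python
def serve(user_agent: str) -> str:
    """Stand-in for a cloaking site: a keyword-stuffed page for the
    declared crawler, a clean page for everyone else."""
    if "Googlebot" in user_agent:
        return "<p>keywords keywords keywords</p><p>Real content.</p>"
    return "<p>Real content.</p>"

# A verification crawl identifying itself as an ordinary browser...
as_browser = serve("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)")
# ...versus the page handed to the declared crawler.
as_bot = serve("Mozilla/5.0 (compatible; Googlebot/2.1)")

print(as_browser != as_bot)  # True: the mismatch is the cloaking signal
```

Any substantive difference between the two responses is the red flag; the site cannot cloak against a visitor it cannot identify.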
  11. seoaddict

    seoaddict Peon

    #11
    Well guys, it's nice to have info like that.
     
    seoaddict, Mar 15, 2006 IP
  12. Cyclops

    Cyclops sensei

    #12
    I have also read that in that particular case the site was reported by someone and Google responded by banning it.

    Whichever way it goes, too many people take way too much notice of what Matt Cutts writes; his ego is really coming into play now. He turned up to a seminar, in Las Vegas I think, and was treated like a rock star.
     
    Cyclops, Mar 15, 2006 IP
  13. disgust

    disgust Guest

    #13
    Why are you assuming Google isn't capable of checking the colors of a background image? I don't think it's all that likely they're doing this, but it's not outside the realm of possibility: there's absolutely no reason they couldn't check this automatically if they wanted to.

    It's pretty much the same as hidden text has always been. Is it possible to do and get away with? Sure. But it's a hell of a lot easier to just do things legitimately and not worry about the engines eventually using methods to catch these tricks.
     
    disgust, Mar 15, 2006 IP
  14. Baz@rr

    Baz@rr Well-Known Member

    #14
    Out of interest, what about review sites and the like which contain spoilers? It's common practice to make the text of these spoilers the same as the background of the page so the user has to highlight them in order to read them. I've seen the same thing done on trivia sites too. Does this mean Google would punish these sites?
     
    Baz@rr, Mar 15, 2006 IP
  15. disgust

    disgust Guest

    #15
    I don't think it's quite as simple as "if this is found via an automated process, your site is banned." it's probably more along the lines of "anything in invisible text won't help in terms of ranking the page for those terms."

    you made an interesting point, though. I'd never really thought of that before.
     
    disgust, Mar 15, 2006 IP
  16. pkchukiss

    pkchukiss Peon

    #16
    In my opinion, Google has not done anything wrong in delisting the site.

    Firstly, that site attempted to game the search engine by stuffing keywords aimed at the search engines; the same text is not viewable in normal browsers. The owner's bluff got called, and now he is accusing Google of censorship.

    I find his claim ridiculous. Google doesn't have an obligation to list his site, and his whining about the delisting only shows how much he depends on Google, as much as he denies it.
     
    pkchukiss, Mar 15, 2006 IP
  17. minstrel

    minstrel Illustrious Member

    #17
    And?

    Cutts provides an occasional insider's view of what Google does. That's information that a lot of webmasters are interested in because Google is still the biggest and the best of the search engines. I would suggest that paying attention to what Cutts says is a lot better than the forum rumor mill or watching Toolbar PR and trying to predict the next update :rolleyes:
     
    minstrel, Mar 15, 2006 IP
  18. Cyclops

    Cyclops sensei

    #18
    Oh come on, Minstrel, everything he says has been mentioned here many times before he blogs about it.
    It's not even an insider's view, it's common knowledge; it's just that when he says something in his blog, all the disciples jump on the bandwagon and rave on like it's another commandment.
    I'm getting too old to be sucked in by all that crap.
     
    Cyclops, Mar 15, 2006 IP
  19. minstrel

    minstrel Illustrious Member

    #19
    That's complete BS, Cyclops. The most important posts from his blog are often debunking persistent (but stupid) myths circulating on forums like this one and gobbled up as truths because someone saw it somewhere and after 10 repetitions it is accepted as a "must be true!" item.
     
    minstrel, Mar 15, 2006 IP
  20. mad4

    mad4 Peon

    #20
    A lot of the stuff he talks about has been discussed for months on SEO forums, but it's always nice to have theories confirmed from the horse's mouth.

    Whatever we may think of Matt and Google, they are the ones controlling the search engine, so I always make myself read anything they write, whether it's the webmaster guidelines or Matt's blog posts (even the crap ones).
     
    mad4, Mar 15, 2006 IP