How do you get a website back from the dead

Discussion in 'SEO' started by t2dman, Apr 6, 2004.

  1. #1
    We have a very peculiar Google.

    My main website - top of Google for over 200 terms, PR6, hundreds of backlinks, 9000+ visitors per month, standard SEO.

    Apply the same standard SEO to other sites that Google has blacklisted, and they are nowhere to be found for their optimised phrases.

    How do I try to get the site back? I applied standard SEO, culled out old reciprocal links, gave it heaps of PR from my PR6 index page, and created a sitemap on my site linking back to theirs to pass text links and PR. There is only one link from them back to me, on an inner page I have not linked to. I also added links from other partners, and added relevant outbound links on every page ...

    And the site is not back from the dead a month after doing the above.

    PLEASE!!! - Is there anyone out there who would like to review the site and offer any suggestions?

    http://www.atthebeach.co.nz
     
    t2dman, Apr 6, 2004 IP
  2. hans

    hans Well-Known Member

    #2
    http://www.atthebeach.co.nz
    Your site is ranked as 3/10 !

    Pages linking to your site:
    No links to your site found.

    no links means no site with PR4 or higher is linking to you - no matter how many personal homepages link to you, only high-quality sites ever count.

    a tip:
    have your site listed in a few relevant local or national directories


    A N D !!!

    P L E A S E - how can Google or any other bot search/crawl a site that has NO VALID code ??

    http://validator.w3.org/check?uri=http://www.atthebeach.co.nz

    of course any normal bot simply crashes or fails to supply valid results to the db

    your site is missing the character-encoding part of your meta tag, and in a global multilingual db such as Google that IS a vital part in allowing a bot to return correct data to its home db !!!
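    For readers wondering what the missing "character encoding part of your meta tag" looks like: in HTML 4.01 a page declares its encoding with an http-equiv meta tag such as <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">. Below is a minimal sketch (Python, standard library only; not from the original post) that checks whether a page declares one - the regex is illustrative, not a full HTML parser.

    ```python
    import re
    import urllib.request

    def declared_charset(url):
        """Return the charset declared in an HTML 4-style meta tag,
        or None if no declaration is found."""
        # Decode as latin-1 so every byte value is accepted; we only
        # need to scan the markup, not interpret the text correctly.
        html = urllib.request.urlopen(url).read().decode("latin-1")
        match = re.search(
            r'http-equiv=["\']?content-type["\']?[^>]*charset=([\w-]+)',
            html, re.IGNORECASE)
        return match.group(1) if match else None

    print(declared_charset("http://www.atthebeach.co.nz"))
    ```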

    here is the rest - an updated final version is available at the link location below.
    a link report has been sent to you by separate email - a quick look showed only 1 link gave a 404
    ------------------------------------ main problems have BEEN:


    This page is not Valid HTML 4.0 Transitional!

    how can such a salad occur ?

    that is easy - very easy, and it has happened to me many times, again and again.

    by

    changing OS
    updating OS and changing major character set configurations
    working on pages on 2 or more different PCs
    changing generator used for the creation of web sites
    changing system character set configurations

    best is always ONE OS, ONE generator, and an HTML syntax verification for every page !
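    As a practical aside (not from the original post): the validator hans links to can be scripted. The legacy W3C Markup Validator documents an X-W3C-Validator-Status response header ("Valid", "Invalid", or "Abort"); the sketch below assumes that header is present, so treat it as an illustration rather than a guaranteed contract.

    ```python
    import urllib.parse
    import urllib.request

    def w3c_status(page_url):
        """Ask validator.w3.org to check a page and report its verdict
        via the X-W3C-Validator-Status response header."""
        check_url = ("http://validator.w3.org/check?uri="
                     + urllib.parse.quote(page_url, safe=""))
        with urllib.request.urlopen(check_url) as resp:
            return resp.headers.get("X-W3C-Validator-Status")

    print(w3c_status("http://www.atthebeach.co.nz"))  # e.g. 'Invalid'
    ```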


    just keep the law of karma in mind - when you confuse and kill a bot - then your site gets killed as well :) !!!!

    this page will be removed after a few days !

    God bless
     
    hans, Apr 7, 2004 IP
  3. mcdar

    mcdar Peon

    #3
    t2dman,

    Nice looking site!

    I ran your site through the Keyword Analysis Tool, and it does seem that your lack of PR>4 links to your site is a major factor.

    You are competing with some big sites but some small sites are ranking with as few as 6 Google backlinks reported.

    You may want to focus your attention on acquiring some backlinks from pages with PR 4 or above.

    Caryl
     
    mcdar, Apr 7, 2004 IP
  4. t2dman

    t2dman Peon

    #4
    Thanks for the pointers - BUT how long does it take for backlinks to show and PR to be updated? MORE than a month! The site will be PR5 when PR is updated. I mentioned the links coming in, and there are a good number of new links - see the "sites of interest" page; a majority of them link back. There is also a new DMOZ link. Of particular interest: a sitemap page I put up on the t2d site ranked higher for a search phrase than the atthebeach site itself.

    This is not a simple problem that a bit of code validation will solve (though I will do it anyway). I quoted the experience I have with Google.

    There is a "very peculiar Google" out there, where it is very easy to do the "wrong" thing and get your site effectively blacklisted. Does anyone have experience with getting a site back that has been "blacklisted" - dropped to position 80+ for its search phrases? Simple SEO just doesn't do the trick.
     
    t2dman, Apr 7, 2004 IP
  5. compar

    compar Peon

    #5
    Well, it depends on what you call "simple SEO". In my book this includes getting backlinks. I haven't even looked at your site, but from what others have said I assume you are short on backlinks. That is the "simple SEO" that will bring you back from the dead.
     
    compar, Apr 7, 2004 IP
  6. hans

    hans Well-Known Member

    #6
    the code validation is far simpler than you think
    wrong code means you kill the bot - a dead bot is doing NOTHING at all for you ! no matter how good your site looks or how many links are hidden behind killer code ..

    and the bot is NOT going to index your site .. and this has absolutely NOTHING at all to do with Google policy or anything of the sort ... any dead bot is useless

    it is as simple as that
    a site NOT correctly and NOT fully indexed by a bot means that no matter how many thousand backlinks you have
    the bot ain't going to find ANY of them ( or only a few ) because you killed the bot

    you fully confused W3C - and their bots and parsers are certainly used to many kinds of wrong code !
     
    hans, Apr 7, 2004 IP
  7. compar

    compar Peon

    #7
    Hans I've never disagreed with you before. And maybe it is just the language barrier, but the bot doesn't have to crawl your site to find the backlinks to your site.

    The backlinks are on other people's sites, so they have nothing to do with the bot crawling your site. I believe that if you have enough backlinks pointing at your page, Google will rank it high no matter what is on the page. There is all kinds of evidence for this. Look at all the Google bombs that have resulted in pages ranking #1 for information that was completely foreign to the page. The most famous is probably "miserable failure", which for months and months put George Bush's official White House biography in #1 place. Even today this search still puts that page in #3 place.

    Do you think that it only put that page in third place because the page is W3C compliant? I doubt it.

    Yes, content and on-page stuff can get a page ranked for non-competitive keywords. And in those situations the bots need to be able to completely crawl your page. But links with the correct anchor text can get any page to the top of the SERPs.

    BTW, the bots have been happily crawling pages that won't validate under W3C for years.

    So I still say that what he needs is simple SEO backlinks. And lots of them.
     
    compar, Apr 7, 2004 IP
  8. t2dman

    t2dman Peon

    #8
    I typically get pages on my site to the top for their phrases in 1 day - one link from a PR6 page, one from a PR5 page. There are plenty of links to atthebeach; it is more complex than that.

    I agree that W3C validation is totally unnecessary - that each of the pages was cached by Google within 2 days of being put up speaks for itself. That several of the pages are first for their term reiterates it.

    I know of other sites' pages that have been dropped by Google, where Google has replied that there is "no penalty" - they are top for their phrase in very competitive terms on Yahoo, and 80th+ on Google.
     
    t2dman, Apr 7, 2004 IP
  9. hans

    hans Well-Known Member

    #9
    Compar

    i fully agree

    i just took too little time to explain it all in full, in all relevant details - a 24-hour day is far too short for me, as such reviews take many hours or more .. besides my own work on my own site

    but of course the site's backlinks are on OTHER sites - that's obvious and you are fully right in that

    but the site itself can NOT be parsed - at least NOT full-page or all pages - or indexed, if the bot is getting screwed up by some weird, non-default or wrong bytes on the page itself - such as happened repeatedly to the w3c bot.

    the pages themselves may be known to SEs, part of the text as well - but part or all of the text may be UNused for searches, as real text search may fail due to said incompatibilities in the characters/bytes used in such files.

    in such extreme cases a bot may read part of the text and stop at the first point ( the character / byte that causes confusion ) or abort indexing after a repeated number of attempts ..
    it is obvious that high-end bots such as Googlebot NEED to have a time-out built in, else they get lost and waste their time - specially if you consider the fact that Googlebot appears to fully check each site approx 3+ times each month, as opposed to Yahoo, who may do one run per month, and others who do just a fraction of a site each month on a random basis, or however they select pages for re-indexing

    to have a site UP in high-ranking results for each page, each page must be known to the bot / Google ( in this case ), as the text itself is stored in the G cache ( at least for some sites ) and then used for scanning for text, or displayed from cache for search users.

    in my site review of the named domain above, there was at first no output at all from w3c because of "fatal errors"; after correcting part of it, at least partial results were available, with error output

    the first error message received from w3c was a
    ""
    meaning NO error output at all - just a fatal error


    there surely is a difference between what a browser can display ( high-end browsers such as Mozilla / NS are extremely forgiving ) - a browser has some 10-30+ MB of code, including many work-arounds to compensate for pages with wrong HTML code - missing tags etc - to still make a human-readable display of the data from that page

    while a bot crawling the text at high speed may be far less complex: it follows all links correctly until an end of page is found, or an insurmountable obstacle in the code, with known obstacles being worked around - such as flash ( strange unknown code, data errors, byte errors, or gross mistakes ). while a browser may have plenty of if-then-else options built in, an average bot most likely/surely has far fewer and depends far more on the accurate syntax of correctly declared data on each site. having spent some time on multilingual data presentation on single pages, i found that most www tools are still far from ideal for crawling/parsing/displaying multilingual characters / pages - hence many popular sites needing multiple character encodings for a single page work around such problems by including such words as GIFs ( or dynamically created GIF/PNG ). that problem is nothing else but the problem of different character encodings created by different web design tools which just accidentally use different character code definitions - like the main text having been designed / written in iso-8859-1 and a minor correction having been added to the same page using another tool/editor that just happens to use UTF-8 encoding - we then may find a salad of different encodings confusing even some browsers - and even more so bots or other tools.


    myself, i sometimes have incorrect display on my system, since i get visitors from 110+ countries and some of their emails or even access_log entries are sometimes ( at least for single sentences or words or characters ) unreadable.

    some - a few only - but regularly - of my access_log entries are NOT readable, some can NOT be fully displayed, some can NOT be parsed by the various software using and processing such data for proper human-readable output ... some are readable but NOT displayable in ICQ, an email client or a browser ... at least single characters or words ... such happens to me repeatedly each month.

    even on correct pages there are plenty of bots that fail to deep-crawl - not at all because of their policy - but because they fail to continue correctly at certain points of a page or site and either loop or just hang or fail ..

    this certainly may apply very specially when charset encodings PLUS html DTDs are mixed within a site

    one particular set of bytes may give TOTALLY separate alphabetical meanings depending only on the charset config of the page ... see UTF-8 needing a separate code page for one group of characters ( language ), each offering ONE precise range of hex addresses within the full UTF code ( probably UTF-32 - or UTF-64, if it exists - may give all at once in one single character code set for all languages worldwide )
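    A quick illustration of exactly this point (a minimal Python sketch, not from the original post): the same bytes decode to different characters, or fail outright, depending on which encoding the parser assumes.

    ```python
    # One set of bytes, two "alphabetical meanings" depending on the
    # declared encoding - the iso-8859-1 vs UTF-8 "salad" hans describes.
    data = "café".encode("utf-8")        # b'caf\xc3\xa9'

    print(data.decode("utf-8"))          # café   - correct
    print(data.decode("iso-8859-1"))     # cafÃ©  - mojibake: é became two chars

    # Bytes written as iso-8859-1 are not even valid UTF-8, so a strict
    # parser stops dead - the "bot aborts at the first confusing byte" case:
    latin = "café".encode("iso-8859-1")  # b'caf\xe9'
    try:
        latin.decode("utf-8")
    except UnicodeDecodeError as err:
        print("strict UTF-8 parser gives up:", err)
    ```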

    maybe GW Bush gets an extra G bonus, and maybe we all should become president for at least a few days :) to receive full advance pardon for any mistake we make on earth ..


    and to finalize this entire topic:

    as i originally wrote my post in "post quick reply" mode .. it became too large and i wanted to run it through spell check in another editor -- hence "copy and paste" ...
    and

    ~/A~/Aa

    another attempt to copy and paste THIS post gave a paste result of

    4/A4/A�d

    the above was the result of pasting THIS post's text from the browser window into a regular editor !
    just a practical display of the real complexity of processing data from various sources, created with various different methods/character encodings.

    repeated attempts failed. why ? a mixture of various signs/bytes/characters may confuse even hardcore systems.
    a single displayed word or line or paragraph consists of VISIBLE bytes being displayed for reading ( letters or signs ) and may as well contain almost any number of INVISIBLE bytes of text ( code ). the most common invisible bytes used are LF, CR and right-to-left marks ( arabic ), but there are many others as well, PLUS potential error strings accidentally created.
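    A small sketch (again Python, not from the original post) making the visible/invisible distinction concrete - two strings that can render identically yet differ in their hidden control bytes:

    ```python
    # Visible text plus invisible control bytes - the CR, LF and
    # right-to-left marks mentioned above. repr() exposes what a
    # normal display hides.
    visible = "bed and breakfast"
    hidden = "bed and breakfast\r\n\u200f"  # CR, LF, RIGHT-TO-LEFT MARK

    print(visible == hidden)  # False, though both may display the same
    print(repr(hidden))       # 'bed and breakfast\r\n\u200f'
    ```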

    when i copied only a few paragraphs at a time it all worked - and earlier it always worked in full as well. this time it was just another practical proof of the complexity and mixture of INvisible and visible content in ANY file - no matter whether html, txt, .doc or whatever format !

    just a few days ago i received an email ( most likely of arab origin ), and any attempt to reply to it always resulted in MY email client (Evolution) writing right to left ( the arab way ) ...

    there was absolutely no way to change that behavior in my mail client except to start a new mail and copy his address and subject line - instead of clicking reply ..

    that encoding (right to left) was somehow invisibly included in his mail text - persistently overriding my own editor configuration ..

    however, such things are actually a topic far beyond my precise knowledge - having been in computing only since the early 1970s, with 30+ k hrs spent mostly as a simple user or application programmer (Cobol), hence in languages far from machine code ( assembler ). maybe someone CLOSER to the actual processors - such as real assembler programmers - knows more details of this topic, as all strings finally need to be processed by the processor and go through the machine in a far different way than HTML or ASCII - byte by byte - including all strings in SEs.

    this topic exceeds my expertise, and i should probably never have done the site review nor the post, as there are many other professional SEOs with longer and more varied expertise and experience than mine.

    when t2dman says:
    "I agree that W3C is totally unnecessary.."
    maybe he is fully right ..
     
    hans, Apr 8, 2004 IP
  10. t2dman

    t2dman Peon

    #10
    Thanks Hans for your views. I have now validated the site and added a few extras. PR and backlinks have now updated: PR5, with 80 backlinks showing for the index page. Still ranked 352 in Google for "Christchurch Bed and Breakfast". Some pages are top. It will be interesting to see what happens over time.
     
    t2dman, Apr 10, 2004 IP
  11. hans

    hans Well-Known Member

    #11
    yes -
    but now ..

    Christchurch Bed and Breakfast

    http://www.digitalpoint.com/tools/suggestion/?keywords=Christchurch+Bed+and+Breakfast

    gives
    christchurch bed and breakfast 3.0 /day
    christchurch bed breakfast 5.7 /day ( overture )

    and a few of these may be from your and my testing

    hence

    when people select the keywords they focus on,
    they may need to drop their personal brand keywords

    and rather learn to think popular:
    HOW do OTHER people think and query ?
    what exact words is the potential customer using when querying a SE ?

    the above results show that there is virtually no one using these precise strings/terms,
    hence it might be a better idea to search for NEW words you want to focus on, to increase your customer base

    of course you may have to account for local traditional or cultural terms NOT included in the above Wordtracker or Overture results.

    every country may have a variation of international English - some local / national terms for a particular service or product.

    whatever - if you want to be more ATTRACTIVE by being found - then think more global ...
    christian 2,292.0 /day
    bed and breakfast 2,377.0 /day
    christian bed and breakfast 19.0 /day
    christian camp 176.1 /day
    christian youth 124.0 /day with variations !
    family christian center 13.0 /day
    christian vacation 24.0 /day

    and out of the above REALLY used terms
    you may SEE what is searched for / needed
    and may want to fill a true NEED of God-seekers ..
    by
    adapting keywords
    and adapting your services to needs - and a new keyword focus ...

    and you may get more people than you can accommodate ...

    people are lost - some of them - when they look out for your services for God - maybe they need MORE than bed and breakfast -
    and then be on the road ( to where ...? ) again ... and if you offer some real content - in your keywords as well - content beyond and above bed and breakfast ..
    then you may find more people coming and gathering there at your nice place

    bed and breakfast sounds like something for the homeless - one night's bed and breakfast - and then a kick and out on the road, homeless again ..
    i have seen thousands of such in Munich, Germany - even in cold winter at minus 10-20 C ...

    this is just in case you would love to have a few more people joining you there .. your site looks nice, the location also ..
    all that might be missing is some more attractive content of life to actually GO there. God is love - Jesus as well -
    life can be exciting and fun - for all !

    your place looks like it offers far more than the keywords you appear to focus on ..

    the popularity of a site may be limited simply by the wrong keywords in use
     
    hans, Apr 10, 2004 IP
  12. t2dman

    t2dman Peon

    #12
    Thanks for your suggestions Hans.

    The site is in the business of promoting itself to sell a product in a New Zealand city called "Christchurch". You will find that when you enter "Christ church bed and breakfast" even Google suggests "Christchurch Bed and Breakfast".

    The search tools you suggest are very US-centric and very wrong - although they do have a use. Example: the search term "Auckland Restaurants" shows 28 per day on the DigitalPoint tool; I got 96 per day in March 04 per Google AdWords, and over 9000 visitors to the site for the month across over 200 terms. We are also looking at a population of 4 million in NZ, of which over 1 million reside in one city. Christchurch only has 300,000 people - you don't get many people searching. So why use the tool? It shows which terms are more popular, but you can't trust the numbers.

    I am a believer, and a firm believer that visitors are no good unless their being there achieves something. The same goes for web pages. Why have a page top for a term that will only get looked at by people who will never be interested in buying your product? It's a little easier for religious sites, where potentially everyone is interested in religious ideas. My time costs money, and I am interested in getting the Christchurch Bed and Breakfast site to the top for its terms. You create a site where every page has a different popular term that it ought to rank top for. That way you have a site that does not depend on just one search term.

    Can we get back to the point?
    How do you get a site back that Google has dropped?
    • site now validated (although irrelevant if Google can cache it)
    • site now PR5 with 80 backlinks
    • optimised specifically for different search terms on each page...
    More to the point, there are other sites that the same thing has happened to; therefore in answering this question we are helping many other people. A very Christian thing to do.
     
    t2dman, Apr 11, 2004 IP
  13. calloptionds

    calloptionds Peon

    #13
    Your site is ranked as 3/10 !
    Pages linking to your site: No links to your site found.
     
    calloptionds, May 31, 2010 IP
  14. enrike

    enrike Peon

    #14
    You need to change the title.
     
    enrike, May 31, 2010 IP
  15. SEOho

    SEOho Peon

    #15
    That's a great idea though, seeing how domain age seems to be so important now.
     
    SEOho, Jun 22, 2010 IP
  16. aweseome

    aweseome Guest

    #16
    You need to do a lot of backlinking, including SE submissions and bookmarking.
     
    aweseome, Jun 25, 2010 IP
  17. touchserv

    touchserv Guest

    #17
    I am a believer, and a firm believer, that keyword research is the best tool. I have been using Keyword Research Pro for a while now, and I must admit that this tool is amazing for getting your website to the top of Google. It generates keywords that you can build your site around. Essentially, it is just another SEO tool, but it can really help your website rank higher in the search engines, getting more traffic to your website. This is a great tool with a couple of great features, and it is worth every penny.
     
    touchserv, Jun 27, 2010 IP
  18. ablaye

    ablaye Well-Known Member

    #18
    You should sign up with Google webmasters at http://www.google.com/webmasters
    You'll be able to see your backlinks and any broken links that your site might have.
     
    ablaye, Aug 9, 2011 IP