New site - indexed, but nothing in Google Sitemaps

Discussion in 'Search Engine Optimization' started by northpointaiki, Dec 4, 2005.

  1. northpointaiki (Guest)

    #1
    Hello - newbie here. My site is indexed and on page 2 of MSN for a keyword; I have a Google sitemap uploaded and working; my web log shows Googlebot crawls regularly; and my Google account shows that my home page is indexed (though that is the only indexed page). Google shows no other data - no query stats, crawl stats, or page analysis stats. There are no error stats either. What I get is:

    Your pages in Google

    Data is not available at this time. Please check back later for statistics about your site.

    My site has only been live a few weeks now. Is this just normal, or does this sound like I have been consigned to Google hell for something I don't know about?

    Any help would be appreciated.

    Thanks.

    Paul
     
    northpointaiki, Dec 4, 2005 IP
  2. frankm (Active Member)

    #2
    Yes, that is normal: for new sites it can take a few weeks for data to be available on the Sitemaps admin page. Just be patient and check again in a few days.
     
    frankm, Dec 4, 2005 IP
  3. hans (Well-Known Member)

    #3
    you still have your access_log statistics to see how many visits you have had from each of the major bots so far
    hence you know what the future will bring

    of course you have to assure that your Google sitemap is validated, else Google won't do anything at all with your sitemap
    and you have to assure all your URLs are in the sitemap
    make daily updates of the sitemap and check whether Google also downloads it daily
    the sitemap page on Google always shows you when the sitemap was last downloaded
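    for reference, a minimal Google sitemap file looks about like this - the URL and date are placeholders only, use your own:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2005-12-04</lastmod>
        <changefreq>daily</changefreq>
      </url>
      <!-- one url entry per page of your site -->
    </urlset>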
     
    hans, Dec 4, 2005 IP
  4. northpointaiki (Guest)

    #4
    Thanks to both of you. Hans, the site is validated, the sitemap is set to daily, and so far, at least, with every fresh Google crawl of the sitemap the status shows O.K.

    My web logs do show Google as one of the "biggest" crawlers, as measured by kBytes, which is what prompted my concern. I am getting crawled quite a bit by Googlebot, at least relative to other bots, and yet nothing is showing up yet. I just wondered whether something was wrong. The good news is that, as I said, I am on page 2 on MSN.

    Thanks again.

    Paul
     
    northpointaiki, Dec 4, 2005 IP
  5. hans (Well-Known Member)

    #5
    northpointaiki

    a few months ago i made a brand new site for a resort on an island
    it took less than a week to be in Google
    here is how i proceed in such situations

    1.
    i have my own blog on an "old" domain - where i document the creation of new pages, with links to the new pages on the remote domain
    that blog of course has an RSS feed that is submitted to a bunch of RSS directories and to Yahoo as well (Yahoo by ping)
    hence Google finds these links "all over the world ..."

    2.
    then i have a script to submit to some 20 or so minor SEs and directories
    minor, but OLD and reputable ones with high PR
    some of them index and list within a day or 2
    that instantly creates backlinks to all new pages of a new, beginning domain
    Google visits at least some of these minor SEs daily
    for example
    http://www.searchsight.com/
    (SS accepts each page individually, and they are instantly crawled and online for Google !!)

    to submit to these 20+ minor SEs/directories
    you can either use some of the "submit for free" sites
    or have a perl script on your site just for you
    like this one
    http://www.kriyayoga.com/cgi-bin/submit_url/submit.cgi
    (you may use it of course - some of the "error/failed" messages are wrong - just click and see the details)
    simplesubmit comes from
    http://www.verysimple.com/products.php/simplesubmit.html
    and you need a NEW, updated submit database

    3.
    i have almost every single page linked from the domain root
    /index.html page
    i also have a regular HTML sitemap - also linked on the domain root (see the example further below)

    4.
    and of course i submit a Yahoo sitemap to Yahoo

    all of the above-mentioned combined and done at once
    usually each page is in Google after 3-5 days

    if all pages are fully validated
    then i assume your site gets into Google as fast as mine
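    re point 3 - a bare-bones example of such an HTML sitemap and its link from the root (the page names are made up, use your own):

    <!-- on /index.html -->
    <a href="/sitemap.html">Site Map</a>

    <!-- /sitemap.html - a plain list of links to every page -->
    <ul>
      <li><a href="/products.html">Products</a></li>
      <li><a href="/about.html">About Us</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>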

    Good luck
     
    hans, Dec 4, 2005 IP
  6. northpointaiki (Guest)

    #6
    Hans, thank you. As I'm a newbie, please forgive the newbie questions.

    Questions on your numbered procedures:

    1) " have my own blog on an 'old'" - do you mean, a blog site that has authority, due to its age? Can you provide some guidance as to how you create links on something like this? And how to provide updates on the remote domain?

    2) I think I have done something like this on www.self-promotion.com - at least, it looks similar. It asks for boilerplate information and fills it in on a host of indexes, directories, and search engines - and you can review each entry to ensure it is correct before sending it off. Using this site, I have submitted to: Alexa, BigFinder, Bivlo, Cipinet, EntireWeb, Google, Scrub The Web, and TurnPike; Aeiwi is not going through, for some reason. Checking which engines have me indexed, I see I am indexed on AllTheWeb, AltaVista, Excite, Google, MSN Search, and Yahoo. It doesn't appear I am indexed yet on DMOZ/Open Directory or Lycos, although these were submitted. Also, I initially submitted only my index page; I very recently multi-submitted other important pages.

    3) I am unclear what you mean by your number 3. Do you mean that almost all of your incoming links point to the index page, or that, within your site, you internally link from the index page to almost all other pages? If the former, any link attempts I've made go to my home page - although the link is to www.a1-outdoors.com, not www.a1-outdoors.com/index.htm. I do have a link on my home page to my HTML sitemap (actually, it is part of an include which is on every page).

    4) I was not aware you could submit an xml sitemap to Yahoo as well. I will do this.

    Thanks - if what I've written above is badly off the mark, I'd appreciate any corrections.

    Paul
     
    northpointaiki, Dec 4, 2005 IP
  7. hans (Well-Known Member)

    #7
    1.
    a very active blog that has existed for 1+ years, visited daily by Google and Yahoo, with an automatically created RSS feed visited by the major RSS bots several times daily, or even several times hourly - as is usual for RSS bots
    just have a look at my blog "secrets of love" in the sig links below
    a blog can easily be used to comment on work done - as a help for others doing similar work, as an additional promotional tool to promote other excellent sites, services or products, or to document the ongoing progress of a new site you make for a friend
    you provide updates on a remote site by briefly describing the new products or services offered by that remote business and linking from within the blog text to those NEW pages - just as done above a few times. hence you provide NEWS just like any other news service - online on your blog, with links to the original sources and references
    since blogs and their RSS feeds are fresh content, they are visited far more frequently than most static pages - just as a good forum such as DP is visited daily to grab all the fresh content ...
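    for illustration, one item of such a feed might look like this in RSS 2.0 - the title and URLs here are invented:

    <item>
      <title>New trail maps page at the resort site</title>
      <link>http://www.example-resort.com/trail-maps.html</link>
      <description>Just added: printable trail maps for the whole island.</description>
      <pubDate>Sun, 04 Dec 2005 12:00:00 GMT</pubDate>
    </item>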

    2.
    i have also used selfpromotion.com (paid version) for 5+ years
    that is similar
    but NOT intended to be used for individual pages - only for major subsections of a site
    the "submit for free to ..." sites usually submit to old-fashioned SEs or directories which never crawl the page but only grab the metatags, like in the old days
    for that reason most of the SEs in my own submit tool mentioned above grab the metatags instantly and publish instantly - or with a minimal delay of 1-2 days
    it is less work, as it requires NO preparatory work such as is needed for all the SEs when using selfpromotion.com's services

    DMOZ needs manual submission !!
    usually ONE URL per domain, unless you have a large/huge domain, then 2 or more URLs ... DMOZ may take up to months or a year after submission until you are published in the directory. however, if done correctly, some have seen it happen even within a few weeks

    3.
    i mean:
    internally link from that page to almost all other pages of your site

    maybe just look at my domain index page to see what i mean

    in an earlier post you wrote "...the site is validated"
    until now i never had your URL
    and hence
    i see that validation is but your merry christmas wish, to come true sometime later ...
    santa is most likely never going to do it for you
    i safely assume it's your own job to really do it - if clean code is your goal, for the easiest and safest quality control in SEO

    just see what w3.org thinks about the validation of the code on your pages
    http://validator.w3.org/check?verbose=1&uri=http://www.a1-outdoors.com/
    you may also solve the CSS problems
    http://jigsaw.w3.org/css-validator/validator?profile=css2&warning=2&uri=http://www.a1-outdoors.com/

    there may be controversial philosophies in regard to which tool or generator to use when creating a site
    you apparently prefer microsoft
    so far in my limited (8+ yrs) www publishing experience i have never seen one single fully validated page ever created by any microsoft product - neither frontpage nor ms office ...
    if you correct manually
    and maybe remove
    xmlns:mso="urn:schemas-microsoft-com:office:office" xmlns:msdt="uuid:C2F41010-65B3-11d1-A29F-00AA00C14882"
    on line 2 of your page
    then all else may be easier for you

    look at some of the errors you have and you will see why a bot may/will/can faint and abort a crawl

    example - your code below:
    <a href="http://www.tkqlhce.com/click-1850606-1556312" target="_top" ><a href="http://www.kqzyfj.com/click-1850606-1556312" target="_blank" ><FONT color="#008080">www.sportsmansguide.com</FONT></a>

    in the above you have a nested link - one link opened inside another, the first without any anchor text ... and a closing tag for only ONE of the two links

    the above results in MANY errors like
    "document type does not allow element "A" here."
    which means WRONG link code/tags
    for bots as well, of course
    just keep in mind that a bot is far smaller than any browser we use to SEE a page
    browsers have many error-compensating features
    bots are built for highest efficiency, and some bots simply expect professional-quality code to work properly
    ...

    HOWEVER - the most important factor in the above a-tag error is that the errors occur INLINE above your very own links
    hence, thanks to invalid links to foreign sites, your own links to your additional pages may altogether be skipped by sensitive bots such as Googlebot !!
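    for comparison only - ONE single, properly closed link would look like this (which of the two click-URLs is the right one, your advertiser has to tell you):

    <a href="http://www.tkqlhce.com/click-1850606-1556312" target="_top"><FONT color="#008080">www.sportsmansguide.com</FONT></a>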

    how fatal an error is depends on:

    where the code error occurs in the INLINE text - NOT in the displayed text !

    if the error comes before important text or links
    then all text/links after the error may be skipped or lost

    you may also make full use of the alt attributes ... it's better for the SERPs if used properly


    4.
    a Yahoo sitemap is different from a Google sitemap
    >>> a plain text file, one URL per line
    and an MSN sitemap ?? no idea - but since MSN is far behind in development BUT making huge crawling efforts to catch up, we may offer an additional HTML sitemap - with CORRECT a-tags ;-)
    hence at least 3 versions of sitemaps - all 3 linked on the domain index page, and each search engine takes what it can digest and process
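    the Yahoo version is as simple as it gets - one plain text file, for example (placeholder URLs and file names):

    http://www.example.com/
    http://www.example.com/products.html
    http://www.example.com/contact.html

    and all 3 sitemaps linked from the index page, something like:

    <a href="/sitemap.html">site map</a> <!-- HTML, for visitors and MSN -->
    <a href="/sitemap.xml">sitemap.xml</a> <!-- Google -->
    <a href="/urllist.txt">urllist.txt</a> <!-- Yahoo -->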

    to finalize
    A.
    i am fully aware that fully and truly valid code may be a pain for many
    but actually it is easily possible for all
    it excludes ANY and every possible error on YOUR side and is under YOUR direct control
    i have no problems with it (except for the forum software, which creates its own mess)
    but blog and hand-created pages are all repeatedly validated down to 0 errors, again and again
    be aware that when copy/pasting foreign code into a valid page you may screw up your code again
    this of course also applies to code coming from "big" domains such as Google - they are too rich to need to worry about quality !!
    validate - by hand - one page at a time until all are done
    if your domain is of any value to you - you may use a site-creating generator that allows manual code verification/editing

    B.
    after all that is done
    validate links and CSS
    then promote all pages again so all bots RE-visit - on selfpromotion as well - just click the resubmit option

    C.
    one last factor that might help motivate some toward truly, fully validated code
    - think future
    - think years ahead when making pages NOW

    NOW, and already for a few years,
    i have seen an increasing number of WAP devices and mobiles surfing my site
    RSS wap feeds as well as regular content
    just look at the agent part of your access_log statistics to see the number of mobiles surfing YOUR site
    ALL mobiles have browsers that usually are below 1 MB in size
    hence such mobiles can only display correctly when the code is clean
    some industries may attract more mobile surfers - others fewer
    in a very few years the number of mobile surfers may be ten- or a hundredfold
    and all those "defending" their errors out of laziness or other reasons may either have to fix up their sites
    or will be left behind ...

    fortunately - 2 years ago i had an intense experimenting phase with WAP devices - on my own site as well as across the www, using various mini-browsers such as doris, the built-in nokia 3650 browser, the mobile opera version and other tiny browsers

    a clean html/xhtml page can always be efficiently surfed with any of these mini-browsers
    while poorly coded pages just create a mess and thus may cause mobile surfers to leave

    NOW the number of mobile surfers might be small
    but it is a fast-increasing new customer group for SOME industries or sites

    another point of motivation might be

    if ever, for any reason, you want to CONVERT your site content into
    - another format - let's say from HTML to XHTML, using a tool if such exists
    - a neat and clean PDF ebook, using the full version of acrobat
    for example, i offer several ebooks of my content for free download, and they are downloaded a few thousand times each month ! all these ebooks were created with minimal additional preparatory work, using acrobat, as full, real ebooks. of course i could also use the less nice-looking version created by html2pdf tools in Linux
    i may even think about having some of these ebooks printed via print on demand ... here again, fully valid code is a requirement for easy work and to assure the printed pages display as desired

    a clean conversion from ONE code to another depends on CLEAN, validated code to start with
    messy code simply fails to be cleanly converted and may be even messier after conversion into the new code

    typical code conversions may be
    - html to xhtml
    - html to ps
    - html to pdf
    - html to text
    - pdf to html
    and others ...
    the cleaner the source - the cleaner the end result

    once you have the proper tools and have learned the proper procedures to create and revalidate your pages
    there will be NO additional work at all, but lots of FUTURE time savings and work savings, as well as other benefits - such as NEVER having to worry about why a page is NOT indexed. if a page is NOT indexed, then you know that you have done ALL on YOUR side, and all that might be missing is a few more links or some more progressive promotional methods such as the above-mentioned blog/RSS feed, or cross-linking from other pages to relevant chapters - all within your own domain

    have fun
     
    hans, Dec 4, 2005 IP
  8. northpointaiki (Guest)

    #8
    Hans - thank you for the replies. Much to chew on. I will go over it again and ponder. When I said that the site is validated, I was referring to the Google Sitemaps page, and using the wrong word. The site is verified, which is what I thought you were talking about. Also, since neither my weblog nor one other source (which now eludes me) listed anything on the error page, I thought I was in the clear. I will look more deeply into what you laid out in such detail. Thanks.

    The invalid links - these are links to my affiliate advertisers - the code was prescribed by them. It's the way I get paid. I will bring up the validation issues you raised with them.

    Many thanks again.

    Paul
     
    northpointaiki, Dec 5, 2005 IP
  9. hans (Well-Known Member)

    #9
    re your affiliate links:

    that may happen again and again - even Google sometimes provides wrong code ...
    code offered by other sites may be totally wrong and hence fully mess up your entire site

    either find out what the links really should be - it may be a copy-and-paste error on one end or the other
    or remove them until clean code is provided by that affiliate site

    good luck and success in your work
     
    hans, Dec 5, 2005 IP
  10. northpointaiki (Guest)

    #10
    Thanks, Hans.

    One curious thing, though. The report shows an error:

    Error Line 38, column 91: required attribute "ALT" not specified.

    ...slelogo.jpg" width="749" height="100"></TD>

    The attribute given above is required for an element that you've used, but you have omitted it. For instance, in most HTML and XHTML document types the "type" attribute is required on the "script" element and the "alt" attribute is required for the "img" element.

    Typical values for type are type="text/css" for <style> and type="text/javascript" for <script>.


    The problem is that my line 38 is nothing like what the validation tool says is on line 38 - the only thing there is a TD.

    I do have an include - my logo - which has the line in question, but in that include, on that line, what the validation tool describes as missing - the closing ">" - is in fact there; it isn't missing. Am I missing something here?
     
    northpointaiki, Dec 5, 2005 IP
  11. hans (Well-Known Member)

    #11
    YES

    1.
    if you are using includes
    then of course your own code offline looks different, since it is shown without the includes ...
    hence the line count offline is wrong

    if i look in my editor at the code as served to the browser - i.e. with all includes
    then i see of course that alt attribute error exactly on "Line 38"

    height="100"> should read
    height="100" alt="your alternative text describing your pic or gif here">

    you may surf your page
    then view the source in the browser
    to see everything, including the SSI output
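    for illustration (the file name is made up) - an SSI directive like

    <!--#include virtual="/includes/logo.html" -->

    is replaced by the full contents of logo.html before the page is sent, so every line inside logo.html shifts the line numbers the validator reports compared to your offline file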

    2.
    as a general rule for solving errors, always start with the topmost (inline top) errors first
    some of the early errors, once SOLVED, may cause many - sometimes even all - subsequent errors to disappear

    for example, if you have a missing closing tag
    (such as in your links, where a container tag is missing), that may cause the following tags to be read wrongly

    3.
    since you are working with includes, you have to correct for the number of included lines when validating offline
    i do validation in Quanta Plus offline and later with the Firefox web developer toolbar online
    online always shows the full code, incl. SSI

    4.
    re your affiliate link error
    have a look at the precise code they provide
    maybe somehow 2 aff links got copied into each other, and when you remove the second a-tag, the remaining link is correct, with one anchor text and closing tag as found on the original site where you get your aff links
    ...
     
    hans, Dec 5, 2005 IP
  12. northpointaiki (Guest)

    #12
    Thanks, Hans. Will go back through the site to work it out. You have been quite helpful.

    Paul
     
    northpointaiki, Dec 5, 2005 IP