Subdomain - Depth from folder root and sitemaps

Discussion in 'Search Engine Optimization' started by Ganceann, Nov 19, 2006.

  1. #1
    Hi,

    A couple of queries regarding subdomains and the depth from the root folder.

    It is suggested that we keep pages no more than 2 links deep - however, many sites go deeper than that, especially those with directory structures.

    In effect the www is a subdomain so this is my question:

    Site info:

    Scenario A:
    www.example.com (depth = 1, counting www as a subdomain)
    www.example.com/category/example.html (depth = 2)
    www.example.com/category/subcategory/example.html (depth = 3)

    Scenario B:
    example.com (depth = root)
    example.com/category/example.html (depth = 1)
    example.com/category/subcategory/example.html (depth = 2)

    In scenario A the depth is 3, which is above the recommended 2-deep link structure. Does this make it more likely that pages at the third depth end up mainly in the supplemental index, purely because of the www subdomain?

    Would it therefore be better to use scenario B rather than scenario A, in order to stay within the maximum 2-deep link structure?

    In essence it is the www vs non-www question. However, from an SEO point of view the search engines see the www as a subdomain and therefore add an extra depth to the link structure.

    For sitemaps:
    Given the sitemaps.org protocol and the longer-term implications of using sitemaps - from an SEO point of view, are there any sitemap scripts that actually read the last change date from the server (and enter it correctly into the <lastmod> field)?

    I have checked a couple, and even when set to read the date from the server, they returned the date the sitemap was generated for every page.

    In addition, for sitemaps and SEO, most content will not change often after it has been submitted - although comments on blogs etc. mean a page needs to be re-indexed because its content has effectively changed.

    In this case, setting the change frequency to monthly or weekly by default would declare that the content changes on that basis. The last modified date is still the issue, though: I am looking for a sitemap script that accurately picks up the time of each change and does not require every individual page to be edited manually whenever a change has occurred.
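    Something along these lines is roughly what I mean - a minimal sketch, assuming the pages are static .html files under the server's document root (DOC_ROOT, BASE_URL and the output filename are placeholders, not a real setup):

    Code:
    import os
    from datetime import datetime, timezone
    from xml.sax.saxutils import escape

    # Placeholder settings - adjust to the actual site.
    DOC_ROOT = "/var/www/example.com"   # server document root
    BASE_URL = "http://example.com"     # site base URL
    OUTPUT = "sitemap.xml"

    def iter_pages(root):
        """Yield (url, last_modified) for every .html file under root."""
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if not name.endswith(".html"):
                    continue
                path = os.path.join(dirpath, name)
                rel = os.path.relpath(path, root).replace(os.sep, "/")
                # Use the file's modification time, not the time the
                # sitemap is generated, for <lastmod>.
                mtime = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
                yield BASE_URL + "/" + rel, mtime

    def write_sitemap(pages, output):
        """Write a sitemaps.org-format sitemap with per-page <lastmod>."""
        with open(output, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url, mtime in pages:
                f.write("  <url>\n")
                f.write("    <loc>" + escape(url) + "</loc>\n")
                f.write("    <lastmod>" + mtime.strftime("%Y-%m-%d") + "</lastmod>\n")
                f.write("  </url>\n")
            f.write("</urlset>\n")

    if __name__ == "__main__":
        write_sitemap(iter_pages(DOC_ROOT), OUTPUT)

    For database-driven pages (e.g. blog posts where comments change the content), the same idea would apply, except the <lastmod> value would need to come from the database's last-updated timestamp rather than a file's modification time.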

    Are there any sitemap scripts that actually do this?

    Thanks.
     
    Ganceann, Nov 19, 2006 IP
  2. #2
    Has anyone seen conclusive studies of folder depth in URLs? I just can't properly structure some sites without 3 folders:
    domain.com/location/category/theme/page

    I'm prominent on Google but not at all on Yahoo. Is this killing me?

    Thanks
     
    seobrien, Jun 22, 2007 IP
  3. #3
    From a sitemap standpoint, there is no depth limit. I have gone 11 levels deep without problems.

    The issue is when Google calculates PageRank. They seem to stop at level 4, though I have never tested this; I have seen PR assigned at the 4th level but never below that.
     
    catanich, Oct 25, 2007 IP