Explain to me, please, how do directories (not DMOZ!) work?

Discussion in 'Directories' started by mq778b, Oct 13, 2007.

  1. #1
    Only DMOZ offers a sensible downloadable listing that can be freely and widely used by applications, etc.

    Other directories must be tediously scraped, or they even forbid doing so (the Yahoo directory, for example).

    Can somebody explain to me why it is worth being listed in a directory whose listings no search engine can use, and how such a directory works?!

    I can't imagine a web user manually looking through web directories (especially small ones) to find something.

    With the Yahoo directory, one pays to be listed in Yahoo search, but what does one get from the many small directories that have no search engine of their own?

    Thanks
     
    mq778b, Oct 13, 2007 IP
  2. pipes

    pipes Prominent Member

    #2
    I would suggest searching this particular section of the forum, as these same questions have recently been asked and answered in many ways.

    If you're still looking for more info after that, then I and possibly some others will gladly explain further.
     
    pipes, Oct 13, 2007 IP
  3. YMC

    YMC Well-Known Member

    #3
    First off, welcome to the forum.

    Secondly, search engines make use of directories every day. The way directories categorize sites, and hopefully list only relevant and useful ones, helps the search engines determine what the listed sites are about, based on the categories they appear in and the text used to link to them.

    The niche directories, which by their nature are usually smaller, do have people coming to them looking for information. Most niche directories have significantly more visitors than submitters; they provide a viable alternative to the search engines because they are kept tight to topic and don't list someone under 'Flowers' just because the word appears in an H1 tag somewhere.

    As to small general directories, DMOZ and Yahoo did not start out at the size they are today. That small, little-known site may become tomorrow's DMOZ or disappear completely. In that respect, submitting to directories is a risk, but the time invested is small and the overall benefits outweigh the time lost when a directory or two disappears.

    Like Pipes has already said, there are tons of threads in this section of the forum on this topic where you can find more detailed information.
     
    YMC, Oct 13, 2007 IP
  4. workshop

    workshop Guest

    #4
    Google uses directories, amongst many other factors, to determine who gets ranked where. It takes effort, but webmasters can get very valuable exposure in the SERPs if they keep working at it over time. However, you need to learn how to spot the directories that are going to last versus the ones that are here today and gone tomorrow.
     
    workshop, Oct 13, 2007 IP
  5. mq778b

    mq778b Peon

    #5
    The question was: why don't directories (DMOZ excluded) provide their listed sites for bulk download the way DMOZ does (say, in XML format)?
    I'm not concerned about the effort search engines spend scraping their data (or the intelligent work bots must do to match a site description with its URL).
    A bulk download would save the directories' bandwidth and provide higher-quality data.
    Some of them even block search-engine bots (via robots.txt), and the Yahoo directory won't let you scrape it at all (it blocks your IP)!
    So the question is: why do directories collect data and then not give access to that data in bulk and/or in an affordable way?
    I wasn't asking about the usefulness of external links.
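    For concreteness, DMOZ publishes its full listing as a bulk RDF/XML dump, and that is what makes it so easy to reuse. Below is a rough sketch of pulling URL, title, description, and category out of such a dump with the Python standard library. The file name content.rdf.u8 and the element names (ExternalPage, Title, Description, topic) follow the ODP dump format as I recall it, so treat them as assumptions rather than a definitive parser.

    ```python
    # Sketch: stream a DMOZ-style RDF/XML content dump and yield listing records.
    # Assumptions: local file 'content.rdf.u8', ODP-style ExternalPage elements.
    import xml.etree.ElementTree as ET

    def localname(tag):
        """Strip any XML namespace so we can match on local element names."""
        return tag.rsplit('}', 1)[-1]

    def iter_listings(path):
        """Yield dicts with url, title, description and category for each entry."""
        for _, elem in ET.iterparse(path, events=('end',)):
            if localname(elem.tag) != 'ExternalPage':
                continue
            # The 'about' attribute may or may not be namespaced in practice.
            url = elem.attrib.get('about') or next(iter(elem.attrib.values()), None)
            record = {'url': url, 'title': None, 'description': None, 'category': None}
            for child in elem:
                name = localname(child.tag)
                if name == 'Title':
                    record['title'] = child.text
                elif name == 'Description':
                    record['description'] = child.text
                elif name == 'topic':
                    record['category'] = child.text
            yield record
            elem.clear()  # keep memory flat on a very large dump

    if __name__ == '__main__':
        for i, rec in enumerate(iter_listings('content.rdf.u8')):
            print(rec['category'], rec['url'])
            if i >= 20:
                break
    ```

    The point is that a consumer never has to crawl the directory's pages at all; anyone can rebuild the whole listing from one file. That is exactly what other directories don't offer.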
     
    mq778b, Oct 14, 2007 IP
  6. workshop

    workshop Guest

    #6
    Why would you want such a list? What use does it have?
     
    workshop, Oct 14, 2007 IP
  7. werewe12

    werewe12 Well-Known Member

    #7
    They're called RSS feeds. Most directories offer these feeds for exactly this purpose.
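    As a minimal sketch of what that looks like in practice, here is how one might poll a directory's "new listings" feed with the Python standard library. The feed URL is made up for illustration; any RSS 2.0 feed with item/title/link/description elements should work the same way.

    ```python
    # Sketch: read a directory's RSS feed of new listings (RSS 2.0 assumed).
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = 'http://example-directory.com/new-listings.rss'  # hypothetical URL

    def fetch_new_listings(url=FEED_URL):
        """Return the items currently advertised in the directory's RSS feed."""
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        listings = []
        for item in root.iter('item'):
            listings.append({
                'title': item.findtext('title'),
                'link': item.findtext('link'),
                'description': item.findtext('description'),
            })
        return listings

    if __name__ == '__main__':
        for listing in fetch_new_listings():
            print(listing['link'], '-', listing['title'])
    ```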

     
    werewe12, Oct 14, 2007 IP
  8. mq778b

    mq778b Peon

    #8
    1. An RSS feed contains only the last few additions. There is no way to figure out which sites were phased out of the directory; one would have to scrape the RSS feed several times a day for a whole year. ONLY new sites are listed in RSS, not the sites that pay for the next year (Yahoo). (A small sketch of diffing full snapshots instead follows at the end of this post.)

    2. Small (specialized, topic-oriented) search engines would be happy to have simple access to directory data. Interesting things can be extracted by cross-referencing various directories.

    3. Another aspect: DMOZ requires a backlink from every site that uses its data, and it gets one from ... Google! Name me a directory owner who does not dream of such a backlink!
    That is the question!

    4. As an example of how DMOZ data is used, DMOZ has a special category:
    www.dmoz.org/Computers/Internet/Searching/Directories/Open_Directory_Project/Use_of_ODP_Data/

    OR

    www.we-globe.net/WebLab/Dmoz/
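    Regarding point 1: here is a minimal sketch of the snapshot diff that a full dump allows and an RSS feed of recent additions never can. The two snapshot file names and the one-URL-per-line format are assumptions purely for illustration.

    ```python
    # Sketch: compare two full directory snapshots to see additions AND removals.
    def load_urls(path):
        """Read one normalized URL per line into a set."""
        with open(path, encoding='utf-8') as fh:
            return {line.strip() for line in fh if line.strip()}

    last_month = load_urls('directory-2007-09.txt')   # hypothetical snapshot
    this_month = load_urls('directory-2007-10.txt')   # hypothetical snapshot

    added = this_month - last_month      # roughly what an RSS feed would show
    removed = last_month - this_month    # invisible without a full dump

    print(f'{len(added)} sites added, {len(removed)} sites removed')
    for url in sorted(removed):
        print('dropped:', url)
    ```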
     
    mq778b, Oct 15, 2007 IP
  9. workshop

    workshop Guest

    #9
    Correct me if I am wrong but what you are suggesting is that directory scripts should be written with a feature that feeds a traditional search engine. Is this not what one of the members has proposed as a reason for setting up an association of "quality" directories?
     
    workshop, Oct 15, 2007 IP
  10. mq778b

    mq778b Peon

    #10
    To my mind, a search engine should fetch the FULL directory listing, not be fed some junk extract of the directory. The quality of a directory could be measured by the number of backlinks it gets from search engines and other web applications built on its listing. A directory script should be able to export the full directory content on a weekly or monthly (or some other regular) basis.
    Sophisticated search engines and web applications would then be able to extract useful information from such listings by automatically cross-referencing sites across different directories.
    Again, not a partial feed but a full listing on some regular schedule makes sense and can boost a directory's popularity.
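    To make the cross-referencing idea concrete, here is one crude sketch of how a consumer of full listings might score sites by how many directories list them. The dump file names are hypothetical, and "listed in several directories" is only one possible quality signal, not a claim about how any real engine works.

    ```python
    # Sketch: count how many directory dumps list each URL (one URL per line).
    from collections import Counter

    # Hypothetical full-listing dumps exported by different directories.
    DUMPS = ['dmoz-full.txt', 'niche-dir-full.txt', 'general-dir-full.txt']

    counts = Counter()
    for path in DUMPS:
        with open(path, encoding='utf-8') as fh:
            # A set per file so the same directory cannot count a URL twice.
            counts.update({line.strip() for line in fh if line.strip()})

    # Sites listed by two or more directories -- one crude quality signal.
    for url, n in counts.most_common():
        if n < 2:
            break
        print(n, url)
    ```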
     
    mq778b, Oct 15, 2007 IP