Copying links from an entire website

Discussion in 'Programming' started by Alpha13, Sep 17, 2009.

  1. #1
    Does anyone know of a program or browser extension that will copy ALL internal and external links from an entire website (not just a single page)?

    If I can't find one, I'm thinking of writing a custom script. What language would be best for this?

    Thanks
     
    Alpha13, Sep 17, 2009 IP
  2. #2
    You can use software like Offline Explorer to download the site's contents, but I don't know if there's an option to extract the links only.
     
    Rikesh, Sep 17, 2009 IP
  3. #3
    I use Rubyful Soup for these sorts of tasks. Here's an example of how to
    get all the links from an HTML page:

    require 'rubyful_soup'

    soup = BeautifulSoup.new(html_input)   # html_input is the page source
    links = soup.find_all('a')             # every <a> tag on the page
    p links.map { |a| a['href'] }          # print just the URLs
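    Since the original question asks for links across an entire site rather than a single page, here is a minimal breadth-first crawl sketch using only the Ruby standard library. The method names (extract_links, collect_site_links), the regex-based href extraction, and the page limit are my own inventions for illustration, not part of any library; a real crawler should use a proper HTML parser and respect robots.txt.

```ruby
require 'uri'
require 'net/http'
require 'set'

# Pull every href out of an HTML string and resolve it against base_url.
# (Regex extraction is a simplification; a real parser is more robust.)
def extract_links(html, base_url)
  html.scan(/href=["']([^"']+)["']/i).flatten.map do |href|
    URI.join(base_url, href).to_s rescue nil
  end.compact.uniq
end

# Breadth-first crawl: fetch internal pages only, but record every link
# (internal and external) encountered, up to max_pages fetches.
def collect_site_links(start_url, max_pages = 50)
  host  = URI(start_url).host
  queue = [start_url]
  seen  = Set.new
  found = Set.new
  while (url = queue.shift) && seen.size < max_pages
    next if seen.include?(url)
    seen << url
    begin
      html = Net::HTTP.get(URI(url))
    rescue StandardError
      next  # skip pages that fail to fetch
    end
    extract_links(html, url).each do |link|
      found << link
      # only enqueue http(s) links on the same host for further crawling
      queue << link if link.start_with?("http") && URI(link).host == host
    end
  end
  found.to_a
end

# Quick check of the extractor on a static snippet:
sample = '<a href="/about">About</a> <a href="http://other.com/x">Ext</a>'
p extract_links(sample, "http://example.com/")
# → ["http://example.com/about", "http://other.com/x"]
```

    The same-host check is what separates "crawl the whole site" from "wander the whole web"; external links are still recorded, just never fetched.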
     
    ohteddy, Sep 18, 2009 IP