bash script

Discussion in 'Programming' started by ssimon171078, Jul 31, 2014.

  1. #1
    I need some help: I want to grab all the links from a website with wget:
    wget --spider --force-html -r -l1 somewebsite
    I need to save only the URLs (no js, img, etc.) to a text file.
     
    ssimon171078, Jul 31, 2014 IP
  2. #2
    Hi there,

    I think something like the following line will work. In spider mode nothing is actually saved, so instead of grepping for "saved" you want the request lines (they start with "--" and carry the URL in the third field), then filter out the unwanted extensions:
    wget -r -l 1 --spider --force-html yoursite 2>&1 | grep '^--' | awk '{print $3}' | grep -ivE '\.(js|jpg|gif)$' > urls.txt
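
    As a variant (just a sketch, untested here; the extension list and the urls.txt name are my own choices), you can also let wget skip the unwanted file types itself with its -R/--reject suffix list, so they never show up in the log at all:

    # Reject js/image suffixes during the crawl, then pull the requested
    # URL (third field of each "--date time--  URL" log line) into a file.
    wget -r -l 1 --spider --force-html -R 'js,jpg,jpeg,gif,png' yoursite 2>&1 \
        | grep '^--' | awk '{print $3}' | sort -u > urls.txt

    The sort -u just de-duplicates, since a recursive crawl often hits the same URL more than once.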

    Best Regards,
    Marcel Preda
     
    Marcel Preda, Aug 1, 2014 IP