Ok, you're probably thinking that I am a filthy stealer. Well, I want to know how to rip websites quickly (of course I could save individual images etc., but that takes too long). The reason I want to do this is so that I can study the site and learn how they use their CSS to produce their result. I am not in any way planning to re-upload it or anything like that, it's purely for learning. Thanks.
In Internet Explorer, just do a Save Page but change the save type to "Webpage, complete". It will create a folder with all the images and CSS files along with the page. This only grabs one page at a time rather than the whole website, but it can still give you an idea of how the CSS works...
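If you'd rather do the same thing programmatically, here is a rough Python sketch of the idea: fetch one page and pull down any linked stylesheets so you can read them side by side. It assumes the requests and beautifulsoup4 packages are installed, and the URL and output folder are just placeholders.

# Rough single-page grab: saves the HTML plus any linked stylesheets.
# The URL and output folder below are placeholders, not anything specific.
import os
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

url = "https://example.com/"          # placeholder: the page whose CSS you want to study
out_dir = "ripped_page"               # placeholder output folder
os.makedirs(out_dir, exist_ok=True)

# Fetch the page itself and save it.
resp = requests.get(url, timeout=10)
resp.raise_for_status()
with open(os.path.join(out_dir, "page.html"), "w", encoding="utf-8") as f:
    f.write(resp.text)

# Find every <link rel="stylesheet"> and download the CSS it points to.
soup = BeautifulSoup(resp.text, "html.parser")
for link in soup.find_all("link", rel="stylesheet"):
    href = link.get("href")
    if not href:
        continue
    css_url = urljoin(url, href)                       # resolve relative paths
    name = os.path.basename(urlparse(css_url).path) or "style.css"
    css = requests.get(css_url, timeout=10)
    if css.ok:
        with open(os.path.join(out_dir, name), "w", encoding="utf-8") as f:
            f.write(css.text)
        print("saved", name)

This won't grab images or scripts, but for reading how the CSS is put together it's usually enough.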
Here's a free program that should do what you want: http://www.httrack.com/page/1/en/index.html Hope this helps, Paul
If you are using Firefox, you can right-click and choose View Page Source to see the HTML of the site.
One of the most efficient ways to rip websites is with the program HTTrack, which might look a little confusing at first because of its many options. I would like to walk you through the process of ripping a website..........
Easy solution that I have used in the past: it is called "Backstreet Browser". Put in the web address, set the depth of the website you want to dig to, and let it rip. Not all websites will work with it, but most will. http://www.spadixbd.com/backstreet/
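For anyone curious what that "depth" setting actually does, here is a tiny Python sketch of the idea: a breadth-first crawl that only follows links a fixed number of hops from the start page and stays on the same host. The start URL and depth are placeholders, and it assumes requests and beautifulsoup4 are installed; it just prints what it visits, where a real ripper would save each page to disk.

# Minimal depth-limited crawl, to illustrate the "depth" option in site rippers.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start_url = "https://example.com/"   # placeholder start page
max_depth = 2                        # how many link-hops deep to dig

host = urlparse(start_url).netloc
seen = set()
queue = deque([(start_url, 0)])

while queue:
    url, depth = queue.popleft()
    if url in seen or depth > max_depth:
        continue
    seen.add(url)
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    print("  " * depth, url)          # a real ripper would save resp.text here
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        nxt = urljoin(url, a["href"]).split("#")[0]   # resolve and drop fragments
        if urlparse(nxt).netloc == host:              # stay on the same site
            queue.append((nxt, depth + 1))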