Scenario: you've found a really good article, but it's spread across numerous webpages. The website looks a bit neglected and you're not sure it will exist forever, yet you'd like to be able to access the article and refer to it for the foreseeable future.
Solution: download the article and its linked pages.
Warning
As ever, I take no responsibility for outside websites, or what installing software in the way described by these sites might do to your system. The risk is all yours!
All I can say is that these are steps I've followed myself, and they worked without any visible damage.
Step 1: Install wget
If you have wget installed already, you're good to go; if not, follow this link to install the software. (If you're not sure, try Step 2 first: you'll soon find out if the program is missing and can come back to this step.) The official wget site is here if you want to look at it more closely.
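As an aside, if you happen to use Homebrew on your Mac, wget can also be installed from the Terminal. This is an assumption about your setup rather than part of the linked instructions, so treat it as an optional shortcut:

```
# Install wget via the Homebrew package manager
# (assumes Homebrew is already installed on this Mac)
brew install wget

# Confirm the installation by printing the version
wget --version
```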
Note: this blog post covers work that is all done from the Terminal. If you're uncomfortable with this, it's time to back out now.
Step 2: Create a subfolder and download webpages
- In Finder create a folder in a location where you want to save the webpages
- Open Terminal, type cd followed by a space, then drag the folder from Finder into the Terminal window so that its location appears after cd (see the sketch after this list)
- Press enter
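The resulting command will look something like the sketch below. The folder path here is purely hypothetical; yours will be filled in automatically when you drag the folder across from Finder:

```
# Change into the folder where the downloaded pages should be saved
# (the path below is a made-up example)
cd /Users/yourname/Documents/saved-article

# Optional: confirm you're in the right place before downloading anything
pwd
```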
Step 3: Get ready, Go!
- I'd recommend testing with a simple webpage first, to check that what comes back is what you expect (you don't want to download gigabyte after gigabyte!)
- With this in mind, follow these instructions but ignore the install information; you've already done that (a rough example command is sketched after this list)
- Wait for the pages to download
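The exact command to use comes from the instructions linked above, which aren't reproduced here. As a rough sketch of the kind of invocation involved, the options below are standard wget flags for grabbing a page and the pages it links to; the URL is a placeholder, and the depth and wait settings are assumptions you should adjust to suit the site:

```
# Quick test on a single, simple page first
wget --page-requisites --convert-links https://example.com/article/part-1.html

# Then fetch the article and its linked pages, one level deep:
#   --recursive        follow links on the page
#   --level=1          only go one link away from the starting page
#   --page-requisites  also grab images, CSS, etc. needed to display each page
#   --convert-links    rewrite links so the saved copy works offline
#   --no-parent        don't wander up into the rest of the site
#   --wait=1           pause between requests to be polite to the server
wget --recursive --level=1 --page-requisites --convert-links --no-parent --wait=1 \
     https://example.com/article/part-1.html
```

Keep an eye on the Terminal output during the test run; if far more files are being fetched than you expected, stop the download (Ctrl-C) and tighten the options before trying the full article.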