I frequently find myself wanting to look at a few web pages while I'm without Net access, for example, on a plane. I'm heading out on a three-day trip tomorrow, so I quickly hacked together a tool for situations like this using wget and ActiveWords.
I designated a folder where web site downloads will go. I wrote a simple batch file that invokes wget with the right parameters to download a URL, plus the pages it links to directly, into that folder. Then I created a simple ActiveWords command to launch the batch file and pass it a URL parameter.
The batch file is called wget_page.bat.
cd /d "C:\Documents and Settings\ssimeonov\My Documents\spider_downloads"
c:\dev\bin\wget\wget --recursive --level=1 --page-requisites --convert-links --html-extension "%1"
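For reference, here is what those flags combine to do. This is just a sketch that assembles and prints the wget command rather than running it; the URL is a placeholder of mine, not from the post:

```shell
#!/bin/sh
# Sketch: the wget invocation the batch file performs, printed rather
# than executed. The default URL below is a placeholder.
URL="${1:-http://example.com/}"
# --recursive --level=1 : fetch the page plus the pages it links to directly
# --page-requisites     : also grab the images, CSS, and scripts pages need
# --convert-links       : rewrite links so the local copy browses offline
# --html-extension      : save pages with an .html suffix so browsers open them
FLAGS="--recursive --level=1 --page-requisites --convert-links --html-extension"
echo "wget $FLAGS $URL"
```

The `--level=1` cap is what keeps the download small enough for a quick pre-flight grab: you get the page you asked for and one hop of links, not a whole site mirror.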
I called my ActiveWords command wget: I created a new script command, pointed it at the batch file, and had it pass the URL along as a parameter.
Instead of ActiveWords, you can use Launchy to kick off the batch file. Either way, you will have to tweak the batch file to use your own paths.
Hi, looks useful. However, why not just wget the whole site and then browse it locally? I'm not sure I understand the usefulness of ActiveWords.
Misha, ActiveWords makes this simple. I don't have to open a shell, change directories, type the name of a batch file, etc.
http://www.webaroo.com/ seems to do that out of the box, although you would miss out on the hacking part.