
WebTools


Fetching (downloading) an entire internet/web site.

I had uploaded a batch of files to my internet site without keeping track of what was actually on it. As my FTP link was rather slow (standard telephone modem) and I had access to a high-speed HTTP link, I decided that the easiest way to find out exactly what was on my site was to find some software to grab the whole site. After trawling through search engines and tracking down software, this is what I found. I tried to get it working, and these comments are the result.
WebGet     
[WebGrab (broken link)]    Pros: unknown
Cons: Demo version doesn't download enough files to get my entire site. Directories are downloaded as dirname/index.html - there's no way to specify that it should be saved as dirname/index.htm.
[Web Spider (broken link)]    Pros: file extensions can be changed to htm.
Cons: unregistered version claims to download only about 10 files. However, I couldn't actually get it to download anything.
[Web Copier]    Pros: very configurable
Cons: couldn't get it working, as it kept crashing as soon as I asked it to start downloading.
[SuperHTTP]    Pros: Can drag from a browser? Documentation makes it appear to be exactly what I need.
Cons: need FTP access to get the software, so I can't actually test it.
[Teleport Pro]    Pros: unknown, can't get it working.
Cons: appears to need a Java engine, which is probably why I can't get it working.
[Copernic]    Pros: unknown, can't find where to download it.
Cons: -
[PageSucker]    Pros: extension mapping completely editable.
Cons: HUGELY slow. Possibly because it's written in Java. As such, it needs a Java interpreter.
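
For reference, the basic job all of these programs do - start at one page, save it, follow the links that stay on the same site, and repeat - can be sketched in a few dozen lines of Python. The sketch below is only an illustration of the idea, not a substitute for any of the tools above: the start URL and output directory are placeholders, it renames .html to .htm as mentioned above (though it doesn't rewrite the links inside the saved pages), and it has none of the error handling or politeness a real site grabber needs.

    #!/usr/bin/env python3
    """Rough sketch of a single-site fetcher: follow local links and save
    each page, renaming .html to .htm on the way.  Placeholder URL and
    output directory; illustration only."""

    import os
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    START = "http://www.example.com/index.html"   # placeholder start page
    OUTDIR = "mirror"                             # placeholder output directory


    class LinkFinder(HTMLParser):
        """Collect href and src attributes from a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in ("href", "src") and value:
                    self.links.append(value)


    def local_path(url):
        """Map a URL to a local file name, forcing .html to .htm."""
        path = urllib.parse.urlparse(url).path
        if path == "" or path.endswith("/"):
            path += "index.html"
        if path.endswith(".html"):
            path = path[:-5] + ".htm"
        return os.path.join(OUTDIR, path.lstrip("/"))


    def fetch_site(start):
        site = urllib.parse.urlparse(start).netloc
        todo, seen = [start], set()
        while todo:
            url = todo.pop()
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url) as response:
                    data = response.read()
                    ctype = response.headers.get_content_type()
            except OSError as err:
                print("failed:", url, err)
                continue
            dest = local_path(url)
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            with open(dest, "wb") as f:
                f.write(data)
            # Only parse HTML for further links, and stay on the same host.
            if ctype == "text/html":
                finder = LinkFinder()
                finder.feed(data.decode("iso-8859-1", "replace"))
                for link in finder.links:
                    absolute = urllib.parse.urljoin(url, link)
                    absolute = urllib.parse.urldefrag(absolute).url
                    if urllib.parse.urlparse(absolute).netloc == site:
                        todo.append(absolute)


    if __name__ == "__main__":
        fetch_site(START)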

Checking (validating) links between pages within a site.

[Xenu]     
[LinkCheck]     
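
Both of these walk a site and report links that point nowhere. The idea itself is straightforward; the sketch below is only an illustration of it, not how either tool works: it scans a local copy of a site (the directory name is a placeholder), collects the relative links from each page, and reports any that don't resolve to an existing file. External links are skipped entirely.

    #!/usr/bin/env python3
    """Rough sketch of a link check over a local copy of a site: report
    relative links that don't point at an existing file."""

    import os
    import urllib.parse
    from html.parser import HTMLParser

    SITEDIR = "mirror"   # placeholder: directory holding the local copy


    class LinkFinder(HTMLParser):
        """Collect href attributes from a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


    def check_links(sitedir):
        for dirpath, _dirs, files in os.walk(sitedir):
            for name in files:
                if not name.endswith((".htm", ".html")):
                    continue
                page = os.path.join(dirpath, name)
                finder = LinkFinder()
                with open(page, encoding="iso-8859-1") as f:
                    finder.feed(f.read())
                for link in finder.links:
                    parts = urllib.parse.urlparse(link)
                    if parts.scheme or parts.netloc:
                        continue   # external link: not checked in this sketch
                    target = urllib.parse.unquote(parts.path)
                    if not target:
                        continue   # same-page fragment
                    dest = os.path.normpath(os.path.join(dirpath, target))
                    if not os.path.exists(dest):
                        print(page + ": broken link -> " + link)


    if __name__ == "__main__":
        check_links(SITEDIR)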

Checking (validating) HTML within web documents.

[CheckWeb]     
[Validator]     
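
At the simplest level, what an HTML checker catches is tags that are never closed or are closed in the wrong order. The sketch below shows only that basic idea and is nowhere near a real validator - it knows nothing about optional closing tags, attributes or DTDs, and it isn't how either of the tools above works. The file name is a placeholder.

    #!/usr/bin/env python3
    """Rough sketch of the simplest kind of HTML check: report tags that
    are never closed or are closed out of order."""

    from html.parser import HTMLParser

    # Elements that never take a closing tag.
    VOID = {"area", "base", "br", "col", "hr", "img", "input", "link",
            "meta", "param"}


    class TagChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.stack = []       # open tags with the line they started on
            self.problems = []

        def handle_starttag(self, tag, attrs):
            if tag not in VOID:
                self.stack.append((tag, self.getpos()[0]))

        def handle_startendtag(self, tag, attrs):
            pass                  # self-closing tags need no matching end tag

        def handle_endtag(self, tag):
            if self.stack and self.stack[-1][0] == tag:
                self.stack.pop()
            else:
                self.problems.append(
                    "line %d: unexpected </%s>" % (self.getpos()[0], tag))

        def report(self):
            for tag, line in self.stack:
                self.problems.append("line %d: <%s> never closed" % (line, tag))
            return self.problems


    def check_file(path):
        checker = TagChecker()
        with open(path, encoding="iso-8859-1") as f:
            checker.feed(f.read())
        for problem in checker.report():
            print(path + ": " + problem)


    if __name__ == "__main__":
        check_file("index.htm")   # placeholder file name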

Other tools

UpdateHTML     

Hosted by Force9 Internet - Authored by J.G.Harston - Last update: 17-Nov-2002