Get URL List (GUL) Downloader



So, here it is: another download manager. Why should you try it? There are so many downloaders that it's hard to choose the right one, so we'll simply list GUL's features and leave the decision up to you. GUL was originally written to download regularly changing URLs - news pages, frequently refreshed price lists, spreadsheets and similar resources - and to automate the "updating" process: checking the resource's size and modification date/time on the server, and downloading only if it has changed. New features were added over time, and "Get URL List" has grown into a kind of multi-function downloading Swiss Army knife.
Let's assume you want to get some news page every day.
The page you've just downloaded is full of unneeded banners, ads, counters, subscription and search forms, menus, and large headers and footers. It would be great if "somebody" cut out all this clutter and produced a clean page containing exactly the information you want.

Sometimes you download a price list every day, only to find that it has not changed since the last download - wasted traffic and wasted time.
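Under the hood, the "update mode" listed in the features below amounts to asking the server for the file's size and modification date before downloading. Here is a minimal sketch of that idea in Python, assuming the server answers HEAD requests with Content-Length and Last-Modified headers; the function and the ".meta" cache file are illustrative, not GUL's actual code:

    import urllib.request
    from pathlib import Path

    def changed_on_server(url: str, local: Path) -> bool:
        """True if the remote file's size or modification date differs
        from what was recorded after the last successful download."""
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request) as response:
            size = response.headers.get("Content-Length", "")
            stamp = response.headers.get("Last-Modified", "")
        meta = local.with_suffix(".meta")  # size + stamp cached here
        if not local.exists() or not meta.exists():
            return True  # nothing local yet, so download
        return meta.read_text() != f"{size}\n{stamp}"

    # After a successful download, record what the server reported:
    #   local.with_suffix(".meta").write_text(f"{size}\n{stamp}")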

You want to get a multi-part archive, for example file.part001.rar, file.part002.rar, ... It would be nice to have an automated procedure that gets all these files one by one: you just set the "start" and "end" numbers and begin the download.
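Expanding such a series of numbered addresses is simple; here is a minimal Python sketch (the URL pattern is a made-up example, and "{n}" is a placeholder convention chosen for this sketch, not GUL's syntax):

    def series_urls(pattern: str, start: int, end: int, width: int = 3) -> list[str]:
        """Expand a numbered URL pattern into the full list of part URLs."""
        return [pattern.format(n=str(i).zfill(width)) for i in range(start, end + 1)]

    # series_urls("http://example.com/file.part{n}.rar", 1, 3)
    # -> ['http://example.com/file.part001.rar', ..., '.../file.part003.rar']

GUL's own series mode, listed in the features below, also supports hexadecimal and letter counters.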

Maybe you are a manuals-and-docs hunter, so when you find a "contents" page with links to all the "chapters", you would be glad to have a program that downloads the contents page and every referenced page for you, while preserving cross-reference integrity between all the pages.
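The core trick behind such a program is rewriting every downloaded link so it points at the local copy. Here is a rough one-level Python sketch, using naive regex link extraction purely for illustration (GUL's DOC mode, described in the features below, also handles adjustable nesting depth):

    import re
    import urllib.request
    from pathlib import Path
    from urllib.parse import urljoin, urlparse

    def mirror_contents(contents_url: str, out_dir: str = "docs") -> None:
        """Save a "contents" page plus every page it links to,
        rewriting the links to point at the local copies."""
        out = Path(out_dir)
        out.mkdir(exist_ok=True)
        html = urllib.request.urlopen(contents_url).read().decode("utf-8", "replace")
        for href in re.findall(r'href="([^"]+)"', html):
            absolute = urljoin(contents_url, href)
            name = Path(urlparse(absolute).path).name or "index.html"
            try:
                (out / name).write_bytes(urllib.request.urlopen(absolute).read())
            except OSError:
                continue  # skip unreachable chapters
            html = html.replace(f'href="{href}"', f'href="{name}"')
        (out / "contents.html").write_text(html, encoding="utf-8")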

You want to get a huge file - hundreds of megabytes - but your modem or xDSL line is quite slow, so the download would be a ve-e-ery long pain. You do, however, have many friends with internet access, so you could "distribute" the download among them: you get part number one, your friends get the rest, and when everything is done you gather the parts back into the original file.
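Splitting a download like this rests on HTTP range requests. A minimal Python sketch, assuming the server honours the Range header (the splitting policy and names here are examples, not GUL's protocol):

    import urllib.request

    def byte_ranges(total: int, parts: int) -> list[tuple[int, int]]:
        """Divide `total` bytes into `parts` contiguous (start, end) ranges."""
        chunk = total // parts
        ranges = [(i * chunk, (i + 1) * chunk - 1) for i in range(parts)]
        ranges[-1] = (ranges[-1][0], total - 1)  # last part takes the remainder
        return ranges

    def fetch_part(url: str, start: int, end: int) -> bytes:
        """Download one byte range of the file."""
        request = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
        with urllib.request.urlopen(request) as response:
            return response.read()

Each friend fetches one range; merging is then just concatenating the parts in order into a single file.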

You have a home network with many shared resources (mp3 files, video, etc.). But when you start copying some huge file, its owner turns off his computer and the copy aborts - next time you have to start from the very beginning. Is there a way to resume a broken copy?
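Resuming is simple in principle: check how much of the file is already on disk, then ask only for the rest. A Python sketch for the HTTP case, again assuming the server supports the Range header (the NetBIOS/file:// case GUL handles works the same way, seeking into the source file instead):

    import urllib.request
    from pathlib import Path

    def resume_download(url: str, local: Path) -> None:
        """Continue a partial download, appending to the local file."""
        have = local.stat().st_size if local.exists() else 0
        request = urllib.request.Request(url, headers={"Range": f"bytes={have}-"})
        with urllib.request.urlopen(request) as response, local.open("ab") as out:
            while chunk := response.read(64 * 1024):
                out.write(chunk)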

The Get URL List (GUL) downloader makes all these wishes come true.

GUL 3.0 main features

  • All downloads can be organized into an unlimited number of lists
  • Downloading resources with "date" elements in the address: http://someserver.com/news.cgi?year=2004&month=11&day=20
    GUL can grab such URLs for the current date and/or for previous/next dates (getting "past days" news)
  • Series resource downloading: the "number" can be decimal, hexadecimal, or just an alphabet letter (file-A.zip, file-B.zip, ... file-Z.zip)
  • Download start conditions: always; only if no local copy exists; and "update mode" - the file won't be re-downloaded until it changes on the server (its modification date or size has changed)
  • Three types of URLs supported: "http://", "ftp://" and "file://" (file:// meaning your local NetBIOS LAN shared resources)
  • Resumed downloads, including NetBIOS local resources
  • Scheduler: start a URL list at a specified time, with the option to stop it at a set time (to make use of cheap night traffic)
  • Multiple modes for downloading HTML pages: only the page itself, or together with embedded objects (frames, CSS files, graphics, JavaScript files, Java applets, embedded Macromedia Flash movies, audio and video files - whether to download each data type is controlled separately)
  • DOC mode: the main page is a "contents" page, so once it is downloaded, all links are grabbed from it and downloaded to the same folder. The depth of nested links can be adjusted, and all cross-links between pages are fixed up to keep cross-reference integrity
  • Distributed downloads for huge files ("split" a big file into parts, send download requests to your friends, and merge all the gathered parts back into the original file)
  • Post-processing: calling external programs after downloading
  • Automatic publishing of downloaded files to an FTP server
  • Support for an unlimited number of URL lists, with easy moving/copying of downloads between lists
  • URL exchange with friends by email
  • "GUL server mode": GUL periodically checks incoming mail and receives download requests sent from other GUL instances (used for exchanging distributed downloads, etc.)
  • Multi-language interface with easy self-localization (just edit the text "language" files)
  • Multi-threaded downloading
  • Post-scanning and converting of downloaded HTML/TXT pages: searching and replacing or cutting unneeded text blocks (banner/ad removal, design switching, deleting big HTML headers, JavaScript blocks and so on) with three engines: Search-Replace, BanneRipper (standard banner block ripping) and SiteRipper (ripping blocks common to every page of a site) - see the sketch below
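To give a flavour of what such a post-scan engine does, here is a minimal Python sketch that cuts marked blocks out of a downloaded page (the marker strings are illustrative; GUL's engines apply their own rules):

    import re

    def cut_blocks(html: str, start_marker: str, end_marker: str) -> str:
        """Remove every block between start_marker and end_marker,
        e.g. a recurring banner table or script section."""
        pattern = re.escape(start_marker) + r".*?" + re.escape(end_marker)
        return re.sub(pattern, "", html, flags=re.DOTALL)

    # Example: strip all inline JavaScript blocks from a saved page.
    # clean = cut_blocks(page_html, "<script", "</script>")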

Download



Copyright © AS-Works, 1998-2005