DOS programs to process files captured from the WWW.
====================================================

HTM-CLR.EXE

This program reads a file that you have captured and cleans out all of the
HTML formatting commands, leaving just the text of the file. It prompts for
the names of the files to read and write, then does its magic.

GET-HTTP.EXE

This program extracts the HTTP:// addresses from a file that you have
captured from the Web. I use this to clean up the files I save from Yahoo
or Webcrawler to make the URLs more legible. The program leaves anything
that is inside quotation marks, so you get a little extraneous info, but it
is easily edited out.

Both programs work the same way. They prompt for an input filename, then
for the filename of a file to write the results to. They will not
overwrite an existing file. Rough sketches of how both programs work
appear at the end of this file.

These files are FREE! Use and abuse them as you wish; just do not sell or
distribute them on disk or CD-ROM without my permission (which I will
probably grant for a free sample of the work). If you put them on a BBS,
FTP directory, or Web page, you must leave the 3 files intact and
unmodified, complete with the copyright notices. Shoot me the location
where you are posting via e-mail, just as a courtesy.

Drop in to my Web page at: http://users.aol.com/jorman/

See the other software and graphics that I have available, especially if
you are a musician, an electronics or audio buff, or have an interest in
Fractals.

Copr. 1995 by Jack A. Orman (jorman@megaweb.com)
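
For the curious, here is a minimal C sketch of the tag-stripping idea
behind HTM-CLR.EXE. It is not the actual program source; it just prompts
for the two filenames, refuses to overwrite an existing output file, and
copies the input through while dropping everything between '<' and '>'.

/* Minimal sketch of an HTML tag stripper in the spirit of HTM-CLR.EXE.
   Not the actual program source. Prompts for the input and output
   filenames, will not overwrite an existing file, and drops everything
   between '<' and '>'. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char inname[256], outname[256];
    FILE *in, *out;
    int c, intag = 0;

    printf("File to read: ");
    if (!fgets(inname, sizeof inname, stdin)) return 1;
    inname[strcspn(inname, "\r\n")] = '\0';

    printf("File to write: ");
    if (!fgets(outname, sizeof outname, stdin)) return 1;
    outname[strcspn(outname, "\r\n")] = '\0';

    /* Refuse to overwrite an existing output file. */
    out = fopen(outname, "r");
    if (out) {
        fclose(out);
        printf("%s already exists - not overwritten.\n", outname);
        return 1;
    }

    in = fopen(inname, "r");
    if (!in) { printf("Cannot open %s\n", inname); return 1; }
    out = fopen(outname, "w");
    if (!out) { fclose(in); printf("Cannot create %s\n", outname); return 1; }

    /* Copy the text through, skipping anything inside a tag. */
    while ((c = fgetc(in)) != EOF) {
        if (c == '<')      intag = 1;
        else if (c == '>') intag = 0;
        else if (!intag)   fputc(c, out);
    }

    fclose(in);
    fclose(out);
    return 0;
}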
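
And a similar sketch of the URL-extraction idea behind GET-HTTP.EXE. Again,
this is not the actual source: it reads the captured page on standard
input instead of prompting for filenames, and it simply stops each address
at whitespace or a quotation mark, so it will not reproduce the
quoted-string quirk mentioned above exactly.

/* Minimal sketch of URL extraction in the spirit of GET-HTTP.EXE.
   Not the actual program source. Scans every line for "http://" (any
   case) and writes each address it finds on its own line. */
#include <stdio.h>
#include <ctype.h>

/* True if s starts with "http://", ignoring case. */
static int is_http(const char *s)
{
    const char *want = "http://";
    int i;
    for (i = 0; want[i]; i++)
        if (tolower((unsigned char)s[i]) != want[i]) return 0;
    return 1;
}

int main(void)
{
    char line[2048];

    /* Reads the captured page on standard input and writes one address
       per line on standard output; the real GET-HTTP prompts for the
       filenames instead, as described above. */
    while (fgets(line, sizeof line, stdin)) {
        char *p = line;
        while (*p) {
            if (is_http(p)) {
                /* Copy the address up to whitespace or a quotation mark. */
                while (*p && !isspace((unsigned char)*p) && *p != '"')
                    putchar(*p++);
                putchar('\n');
            } else {
                p++;
            }
        }
    }
    return 0;
}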