Recursive directory listing over HTTP or FTP
I'm trying to find out if there is a way to get a recursive directory listing over HTTP, FTP, or other protocols from sites where I cannot log on.
I've looked at some lynx commands using -crawl and -traversal which do what I want, sort of. I'm trying to implement a search function that can return the directory contents of mostly FTP and HTTP download sites where I am unable to use ssh or another authenticated login.
Does anyone have any ideas for doing this?
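For the FTP side, something like the following is roughly what I'm after: a minimal sketch using Python's standard `ftplib` with an anonymous login, walking directories by parsing Unix-style `LIST` output (the hostname is a placeholder, and the `d`-prefix check assumes the server returns Unix-style listings, which not all do):

```python
from ftplib import FTP

def list_recursive(ftp, path="/", depth=0, max_depth=3):
    """Recursively print the directory tree of an FTP server."""
    lines = []
    ftp.retrlines(f"LIST {path}", lines.append)
    for line in lines:
        # Unix-style LIST output: permissions, links, owner, group,
        # size, month, day, time/year, name (9 fields)
        name = line.split(maxsplit=8)[-1]
        print("  " * depth + name)
        # Entries whose permission string starts with 'd' are directories
        if line.startswith("d") and depth < max_depth:
            list_recursive(ftp, f"{path.rstrip('/')}/{name}",
                           depth + 1, max_depth)

# Example usage against a live server (hypothetical host):
#   ftp = FTP("ftp.example.com")
#   ftp.login()            # anonymous login, no credentials needed
#   list_recursive(ftp)
#   ftp.quit()
```

This only covers anonymous FTP; HTTP has no standard directory-listing command, so there you are stuck scraping whatever index pages the server chooses to expose, which is what the lynx crawl options end up doing.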