Transferring 40000+ files with FTP help needed
I have 40000+ files in one directory and I need to FTP them to another machine. I can't even get an ls of them unless I use file globbing,
i.e. ls file[12]*. If I do a plain ls or mget it shows 0 files. I can transfer the files by globbing with mget file[12]*, but the file names are all very similar and have thousands of numbers in them, so this method takes forever. The machine is a production server and I do not want to recompile the kernel to get more command-line argument memory. Is there an FTP program that can handle this many files, or a script I could use to FTP them? |
I don't know about that quantity of files, but you can use ftp's mget (or mput)
if you are ftp'ing to the server to download the files. Code:
ftp -i server.with.4000.files The -i flag turns off the per-file confirmation prompt, which you will definitely want with that many files. Again, I have never tried to download that many files from a single FTP directory before, so I can't tell you it will work in your particular situation, but it's worth a shot nonetheless. Hope this helps... |
Hi gvec,
The problem is that there are so many files that mget cannot cope; it hangs or doesn't transfer any files. From what I have read, this is down to the command-line memory that is compiled into the kernel. I have a workaround that uses Perl and the Net::FTP module. Perl doesn't have the memory constraint the command line does, so file globbing is no problem. I think this is the best solution, since FTP clients rely on commands such as ls, which fail with this many files. |
The best method is to tar the files and transfer the data in a single stream. If you do not have enough free space for the tarball, you will probably have to mount a network file system and copy the files that way. The FTP protocol is not very good at transferring many files, usually bailing out after a few thousand.
You might also have more luck with scp, but if it were me I'd just mount an NFS or SMB share and do a simple copy. |
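To expand on the tar idea: you can avoid the intermediate tarball entirely by piping the stream over ssh. This is a minimal sketch, assuming the destination machine runs sshd; "user", "otherhost" and both directory paths are placeholders you would substitute. Code:

```shell
# Pack the directory into a single stream and unpack it on the far side.
# tar reads the directory itself, so the 40000 file names never have to
# fit on a command line, and only one network transfer is made.
# "user", "otherhost" and both paths are placeholders.
cd /path/to/hugedir
tar cf - . | ssh user@otherhost 'cd /path/to/destdir && tar xf -'
```

If ssh is not available but you do have space on the far side, tar to a file, FTP the single tarball across in binary mode, and untar it there.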
Tim1235:
You said, "I have a work around solution that uses perl and the Net::FTP package. Perl doesn't have the memory constraint like the command line does so file globbing is no problem." How do you do it? Net::FTP doesn't implement mget, only get. I am writing a script where I want to mget all files starting with a specific id, like mget 310*. Any ideas? |
You can use glob.
glob uses pattern matching operators like ? and * and accepts multiple patterns in one call, i.e. glob "pattern1 pattern2 pattern3". Note that glob only matches files on the local disk, so it is what you want for an mput-style upload; for an mget-style download, ask the server to expand the pattern with Net::FTP's ls method instead. Code:
my @fileList = $ftp->ls("310*");   # sends NLST 310*; the server expands the pattern
for my $file (@fileList) {
    $ftp->get($file) or warn "get $file failed: ", $ftp->message;
} |