LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   -   Transferring 40000+ files with FTP (mget) shows 0 files (https://www.linuxquestions.org/questions/linux-software-2/transferring-40000-files-with-ftp-mget-shows-0-files-216080/)

tim1235 08-11-2004 01:50 AM

Transferring 40000+ files with FTP help needed
 
I have 40000+ files in one directory and I need to FTP them to another machine. I cannot even get an ls of them unless I use file globbing,

i.e. ls file[12]*

If I do an ls or an mget, it shows 0 files. I can transfer the files by globbing with mget file[12]*, but the file names are all very similar and have thousands of numbers associated with them, so this method takes forever. The machine is a production server and I do not want to recompile the kernel to get more command-line argument memory. Is there an FTP program that can handle this many files, or a script I could use to FTP them?

gvec 09-11-2004 11:01 AM

I don't know about that quantity of files, but you can use ftp's mget (or mput).

If you are FTPing to the server that you want to download the files from:
Code:

ftp -i server.with.4000.files
Then, once in the directory with all the files, use mget to download them all without interaction (the -i flag turns off prompting when mget/mput handles multiple files, so it will run down the list without asking you to confirm each one).

Again, I have never tried to download that many files from a single FTP directory, so I can't promise it will work in your particular situation, but it's worth a shot nonetheless.

hope this helps...

tim1235 09-13-2004 02:34 AM

Hi gvec,

The problem is that there are so many files that mget cannot cope; it hangs or doesn't transfer any files. From what I have read, this is down to the amount of command-line argument memory that is compiled into the kernel.

I have a workaround that uses Perl and the Net::FTP module. Perl doesn't have the memory constraint that the command line does, so file globbing is no problem.

I think this is the best solution, as ftp clients rely on commands such as 'ls', which don't work here because of the number of files.
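
For example, as a quick sanity check (the file* pattern below is just an example; use whatever matches your file names), you can count the files from inside Perl, where the shell's argument-length limit never comes into play:

Code:

perl -le 'my @files = glob "file*"; print scalar @files'

It prints the number of matching files without ever handing 40000+ arguments to an external command.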

leckie 09-13-2004 07:51 AM

The best method is to tar the files and transfer the data in a single stream. If you do not have enough free space for the archive, you will probably have to copy the files over a network file system instead. The FTP protocol is not very good at transferring many files and usually bails out at a few thousand.
You might also have more luck with scp, but if it were me I'd just mount an NFS or SMB share and do a simple copy.

SeenaStyle 10-15-2004 01:59 PM

Tim1235:
You said "I have a work around solution that uses perl and the Net::FTP package. Perl doesn't have the memory constraint like the command line does so file globbing is no problem.

How do you do it? Net:FTP doesn't implement "mget", but only "get".

I am writing a script where I want to mget all files starting with a specific id, like:
"mget 310*"

Any ideas?

tim1235 10-17-2004 06:06 PM

You can use "glob".

glob uses pattern-matching operators like ? and *, and it accepts multiple patterns in one call, i.e. glob "pattern1 pattern2 pattern3".

Code:

my @fileList = glob "310* 320*";   # add as many patterns as you need
Then you can loop through this array of files and use get or put.

Code:

# $ftp is a connected Net::FTP object (full setup sketched below)
foreach my $file (@fileList) {
    $ftp->put($file, "/some/dir/$file");   # or $ftp->get($file) to download instead
}
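
Putting it all together, a minimal self-contained script might look something like this. This is just a sketch: the host name, login details, and directory paths are placeholders you would replace with your own, and it assumes an upload with put (swap in get to download instead).

Code:

use strict;
use warnings;
use Net::FTP;

# Placeholders -- replace with your own host, credentials, and directories.
my $ftp = Net::FTP->new('server.example.com', Passive => 1)
    or die "Cannot connect: $@";
$ftp->login('username', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;                            # binary mode so data files are not mangled
$ftp->cwd('/some/dir') or die "Cannot cwd: ", $ftp->message;

# The glob happens inside Perl, so the shell's argument-length limit never applies.
my @fileList = glob "310* 320*";

foreach my $file (@fileList) {
    $ftp->put($file) or warn "Failed to send $file: ", $ftp->message;
}

$ftp->quit;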


