LinuxQuestions.org
Old 08-11-2004, 01:50 AM   #1
tim1235
Member
 
Registered: Aug 2004
Location: Melbourne, Australia
Distribution: fc5/Gentoo
Posts: 57

Rep: Reputation: 15
Transferring 40000+ files with FTP help needed


I have 40000+ files in one directory and I need to FTP them to another machine. I cannot even get an ls of them unless I use file globbing,

e.g. ls file[12]*

If I do an ls or an mget, it shows 0 files. I can transfer the files by globbing, using mget file[12]*, but the file names are all very similar and span thousands of numbers, so this method takes forever. The machine is a production server and I do not want to recompile the kernel to get more command-line argument memory. Is there an FTP program that can handle this many files, or a script I could use to FTP them?

Last edited by tim1235; 08-11-2004 at 08:55 PM.
 
Old 09-11-2004, 11:01 AM   #2
gvec
Member
 
Registered: Aug 2004
Posts: 32

Rep: Reputation: 15
I don't know about that quantity of files, but you can use ftp's mget (or mput).

If you are ftp'ing to the server you want to download the files from:
Code:
ftp -i server.with.4000.files
Then, once in the directory with all the files, use mget to download them all without interaction (the -i flag turns off prompting when you mget/mput multiple files, so it will run down the list without asking you to confirm each file).

Again, I have never tried to download that many files from a single FTP directory before, so I can't tell you whether it will work in your particular situation, but it's worth a shot nonetheless.

hope this helps...
 
Old 09-13-2004, 02:34 AM   #3
tim1235
Member
 
Registered: Aug 2004
Location: Melbourne, Australia
Distribution: fc5/Gentoo
Posts: 57

Original Poster
Rep: Reputation: 15
Hi gvec,

The problem is that there are so many files that mget cannot cope: it hangs or doesn't transfer any files. From what I have read, this is down to the command-line argument memory that is compiled into the kernel.

I have a workaround that uses Perl and the Net::FTP module. Perl doesn't have the memory constraint that the command line does, so file globbing is no problem.

I think this is the best solution, as FTP programs rely on commands such as 'ls', which don't work with this number of files.
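
Roughly, the workaround looks like the sketch below; the host name, login details and directory paths are only placeholders, so adjust them for your own setup:

Code:
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# Placeholder host and login -- substitute your own.
my $ftp = Net::FTP->new('other.machine.example', Debug => 0)
    or die "Cannot connect: $@";
$ftp->login('username', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;
$ftp->cwd('/destination/dir') or die "cwd failed: ", $ftp->message;

# Glob the local directory in Perl -- no shell argument-length limit here.
chdir '/path/with/40000/files' or die "chdir failed: $!";
for my $file (glob '*') {
    $ftp->put($file) or warn "Failed to put $file: ", $ftp->message;
}

$ftp->quit;
The loop takes a while with 40000+ files, but it never has to squeeze the whole file list through a shell command line, which is what was falling over.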
 
Old 09-13-2004, 07:51 AM   #4
leckie
Member
 
Registered: Dec 2003
Location: Australia
Distribution: Mandrake 9.2
Posts: 151

Rep: Reputation: 30
The best method is to tar the files and transfer the data in a single stream. If you do not have enough free space for the archive, you will probably have to copy the files over a network file system instead. The FTP protocol is not very good at transferring many files, usually bailing out at a few thousand.
You might also have more luck with scp, but if it were me I'd just mount an NFS or SMB share and perform a simple copy.
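
Something along these lines would do the single-stream transfer (wrapped in Perl, since that is what the rest of the thread is using); the user, host and paths are only placeholders, and it assumes you have ssh access to the other machine:

Code:
#!/usr/bin/perl
use strict;
use warnings;

# Placeholder source directory and remote destination -- adjust for your setup.
my $src    = '/path/with/40000/files';
my $remote = 'user@otherhost';
my $dest   = '/destination/dir';

# One tar stream over ssh instead of 40000 individual FTP transfers.
system(qq{tar -cf - -C $src . | ssh $remote "tar -xf - -C $dest"}) == 0
    or die "Transfer failed: $?\n";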
 
Old 10-15-2004, 01:59 PM   #5
SeenaStyle
LQ Newbie
 
Registered: Oct 2004
Posts: 1

Rep: Reputation: 0
Tim1235:
You said "I have a work around solution that uses perl and the Net::FTP package. Perl doesn't have the memory constraint like the command line does so file globbing is no problem.

How do you do it? Net:FTP doesn't implement "mget", but only "get".

I am writing a script where I want to mget all files starting with a specific id, like:
"mget 310*"

Any ideas?
 
Old 10-17-2004, 06:06 PM   #6
tim1235
Member
 
Registered: Aug 2004
Location: Melbourne, Australia
Distribution: fc5/Gentoo
Posts: 57

Original Poster
Rep: Reputation: 15
You can use "glob".

glob uses pattern-matching operators like ? and *, and it accepts multiple patterns in one call, e.g. glob "pattern1 pattern2 pattern3".

Code:
@fileList = glob "310* 320*";   # add further patterns to the string as needed
Then you can loop through this array of files and use get or put.

Code:
for my $file (@fileList) {
    # $ftp is the already-connected Net::FTP object
    $ftp->put($file, "/some/dir/$file") or warn "put failed for $file: ", $ftp->message;
}
 
  

