Hi,

This is my first question on the forum. Apologies if I have not posted it in the right category.

I have an sftp script that uses expect to download files. The download stalls after downloading a few files and then exits. There are around six files, and each file is no bigger than 150 MB. However, when I download the files manually I do not face this issue.

Could someone tell me if you notice any issue with the script below?

The timeout parameters at our end and at the client's end are 30 minutes, so I do not think timeout is the issue. There are no threshold settings on data size or number of files.
I see no problems with the script itself, but since you're using SFTP, why are you bothering with expect? The whole purpose of SFTP is to be SECURE... putting names/passwords into a script file that anyone can read negates that security. Read the man page on the sftp command... pay particular attention to the "-b" (batch) flag, which may help you.

That said, you really don't need to be doing an "ls -l" in the script, since it's pointless when you're doing a 'get' later on SPECIFIC files. Why not use "mget *20131231*" to get MULTIPLE files with one statement? And since you can pass the user and host on the command line when you initiate the SFTP connection (again, see the man page), you're really going the long way around.
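For example, a batch-mode run might look something like this (the remote directory, hostname, and batch file name here are only placeholders, and -b assumes the authentication itself is non-interactive):

    # batch.txt -- sftp commands executed non-interactively
    cd /outgoing
    mget *20131231*
    bye

    # run the batch against the remote server
    sftp -b batch.txt user@remote.example.com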
TB0ne, thanks for the reply.

I believe there is no way to specify a password when using the -b option. -b works fine for passwordless sftp accounts, but I am not sure how to supply a password with it. I googled it and did not find anything. Let me know if you are successful in using sftp -b with a password.
You can specify it when prompted for it. Again, putting a password into a script file is a BAD IDEA... and totally negates ANY security that using a secure protocol gives you.

You can do a simple ssh key swap and not NEED to put in a password at all... not only is it simpler to script for, but MUCH more secure.
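A minimal key swap, for reference, might look like this (user and host are placeholders; an empty passphrase is assumed so the cron job can connect unattended):

    # generate a key pair on the machine that runs the cron job
    ssh-keygen -t rsa

    # install the public key in the remote server's authorized_keys
    ssh-copy-id user@remote.example.com

    # sftp now connects without a password prompt, so batch mode just works
    sftp -b batch.txt user@remote.example.com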
TB0ne, I agree with what you say. The client is in favour of password usage; we did ask them about key-based authentication, but they did not agree. The job runs from cron, so manually entering the password is ruled out. I might have found the cause of the download stalling: something to do with one of the proxies on the load balancer. Will update soon.