LinuxQuestions.org > Forums > Linux Forums > Linux - Distributions > Slackware
Old 11-15-2018, 08:10 AM   #1
aikempshall
Member
 
Registered: Nov 2003
Location: Bristol, Britain
Distribution: Slackware
Posts: 900

Rep: Reputation: 153
S3cmd failing when uploading to Amazon S3.


For a number of years I've been using s3cmd, from SlackBuilds.org, to upload to Amazon S3. The other night it failed; I reran the command and it got a bit further, reran it again and it got a bit further still.

The failure message I get is

Quote:
ERROR:
Upload of '/mnt/southsea/amazon_drive_encrypted/alex/.thunderbird/pxgvw4yz.default/global-messages-db.sqlite' part 4 failed. Use
/usr/bin/s3cmd abortmp s3://mcmurchy1917-thunderbird/pxgvw4yz.default/global-messages-db.sqlite DQ35UbS6CFcwaHk97RuT7oEIYp6U.iA6XLdIzePSQHgMQ76hBhjGMKP_D9YbCPy4lemnYdscfdepGgvA8hi87g--
to abort the upload, or
/usr/bin/s3cmd --upload-id DQ35UbS6CFcwaHk97RuT7oEIYp6U.iA6XLdIzePSQHgMQ76hBhjGMKP_D9YbCPy4lemnYdscfdepGgvA8hi87g-- put ...
to continue the upload.
ERROR: S3 error: 400 (RequestTimeout): Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.

My first question: should I have, as the error suggested, run the command

Code:
/usr/bin/s3cmd abortmp s3://mcmurchy1917-thunderbird/pxgvw4yz.default/global-messages-db.sqlite DQ35UbS6CFcwaHk97RuT7oEIYp6U.iA6XLdIzePSQHgMQ76hBhjGMKP_D9YbCPy4lemnYdscfdepGgvA8hi87g--
to abort the upload and then tried again?

Needless to say, I didn't run the command. What damage, if any, have I caused?
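For context on the abortmp question: as far as I know, the parts of an incomplete multipart upload simply remain in the bucket (and count towards storage) until the upload is aborted, so not running abortmp causes no damage to the object itself; running it later is routine cleanup. A dry-run sketch of the relevant s3cmd commands — the bucket, key, and upload id below are placeholders, not the real values from the error above:

```shell
# Placeholders only; substitute your own bucket, key and upload id.
BUCKET="s3://example-bucket"
KEY="global-messages-db.sqlite"
UPLOAD_ID="EXAMPLE-UPLOAD-ID"

# Print the commands instead of running them (no s3cmd needed for the dry run):
echo "s3cmd multipart $BUCKET"                # list in-progress multipart uploads
echo "s3cmd abortmp $BUCKET/$KEY $UPLOAD_ID"  # abort one, discarding its stored parts
```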

Does the error message

Quote:
ERROR: S3 error: 400 (RequestTimeout): Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.
mean that I've got a problem with my internet connection, which might fail big time sometime in the future?

Or should I just wait and see if the problem goes away?

Should I be addressing this issue on another forum?

Any suggestions?

Alex
 
Old 11-15-2018, 07:33 PM   #2
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
Quote:
Originally Posted by aikempshall View Post
I reran the command and it got a bit further. Reran the command and it got a bit further.

Any suggestions.

Alex
Hey Alex:
How about "the command"?

I have a fuse.s3fs mount that we use, and I had to set a cron job (every minute) to get around something similar.
Code:
## Nov. 1st, 2014
### Need this to umount and remount /c9backups
### Error is "Transport end not connected"
See also https://status.aws.amazon.com when such an event occurs.
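A sketch of the kind of cron entry that workaround describes — the bucket name and the exact s3fs invocation are assumptions; /c9backups is the mount point from the comments above:

```
# Crontab fragment (config, untested here): every minute, remount the s3fs
# mount if it has dropped with "Transport endpoint is not connected".
* * * * * mountpoint -q /c9backups || { fusermount -u /c9backups 2>/dev/null; s3fs mybucket /c9backups; }
```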

Last edited by Habitual; 11-15-2018 at 07:37 PM.
 
Old 11-16-2018, 12:41 AM   #3
aikempshall
Member
 
Registered: Nov 2003
Location: Bristol, Britain
Distribution: Slackware
Posts: 900

Original Poster
Rep: Reputation: 153
Quote:
Hey Alex:
How about "the command"?
Sorry, that would have been useful!

Code:
s3cmd sync --delete-removed --limit-rate=36k /mnt/southsea/amazon_drive_encrypted/alex/Documents/ s3://mcmurchy1917-MyDocuments
It would appear that the only files it's failing on are large files that s3cmd is splitting into multipart chunks.

So I'm going to try
  1. increasing the limit-rate to see if speed is a factor
  2. reducing the multipart-chunk-size-mb; at the moment I use the default of 15MB
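For what it's worth, the arithmetic behind these two options can be sketched in the shell (assuming the --limit-rate cap of 36 kB/s from the sync command above is the effective throughput, and 1 MB = 1024 kB):

```shell
# Approximate seconds to upload one multipart chunk at a capped rate:
#   seconds = chunk_mb * 1024 / rate_kb_s   (integer arithmetic)
echo $(( 15 * 1024 / 36 ))   # default 15 MB chunks: ~426 s each
echo $(( 5 * 1024 / 36 ))    # 5 MB chunks: ~142 s each
```

The longer a single chunk takes, the longer one request holds the socket open, so chunk size and rate cap interact directly with any server-side timeout.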
 
Old 11-16-2018, 05:51 AM   #4
aikempshall
Member
 
Registered: Nov 2003
Location: Bristol, Britain
Distribution: Slackware
Posts: 900

Original Poster
Rep: Reputation: 153
Eventually I completed all the outstanding uploads to Amazon S3.

It seemed to fail only when uploading files greater than 15MB, though not every such file failed; s3cmd splits these files into 15MB chunks. The second attempt always seemed to work, so I could always move forward.

Anyway I
  1. increased the limit rate to as fast as it would go - made no difference.
  2. decreased the multipart-chunk-size-mb to 5MB - all the uploads completed successfully. This doesn't mean this fixed the problem, as the uploads may have succeeded anyway.

I shall do another upload of my system over the weekend and see what happens.
 
Old 11-16-2018, 08:05 AM   #5
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
I think I had to use s3cmd put for mine to finish that job. It's been a while.
 
Old 11-19-2018, 04:09 AM   #6
aikempshall
Member
 
Registered: Nov 2003
Location: Bristol, Britain
Distribution: Slackware
Posts: 900

Original Poster
Rep: Reputation: 153
I've now included --multipart-chunk-size-mb=5 in my s3cmd settings and the uploads now complete successfully. I asked on the s3tools-general list at SourceForge, and their recommendation was:

Quote:
it's best to split a large file up into whatever chunk size you can upload to S3 in 1 minute to avoid that socket time out error.
Following the above change, on my system and network, with --multipart-chunk-size-mb=5 and --limit-rate=36k, each chunk uploads in about 140s without failure. I know 140s is a lot more than 1 minute, but let's live dangerously; if it comes back and fails again, I know what to change.
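Taking the list's rule of thumb literally, the chunk size that fits a 1-minute window at a given rate cap can be estimated the same way (a sketch, assuming the cap is the effective throughput):

```shell
# chunk_mb that uploads within ~60 s = rate_kb_s * 60 / 1024
echo $(( 36 * 60 / 1024 ))   # at --limit-rate=36k: about 2 MB
```

So a strict reading of the advice would mean roughly 2 MB chunks at this rate cap; the 5 MB chunks at ~140s are indeed living dangerously, as noted above.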
 
  

