06-04-2021, 11:38 AM   #1
aikempshall
Member

Registered: Nov 2003
Location: Bristol, Britain
Distribution: Slackware
Posts: 900

Rep: Reputation: 153
S3cmd failed with Error Code 74 when uploading to Amazon S3


My backup to Amazon S3 started failing Tuesday morning. This is the command:

Code:
 s3cmd sync --delete-removed --limit-rate=288K --multipart-chunk-size-mb=5 --no-progress LOCAL_DIR s3://BUCKET
I searched everywhere for an explanation or solution to this problem; all I could come up with was a very terse

Quote:
An error occurred while doing I/O on some file.
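For what it's worth, that message and the 74 exit status look like the BSD sysexits convention, in which 74 is EX_IOERR ("input/output error"). If your libc ships the header, you can confirm the mapping with:

Code:
# should show that 74 is EX_IOERR, if your system has sysexits.h
grep 'EX_IOERR' /usr/include/sysexits.h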
After much playing with the --verbose and --debug flags, I discovered that if I fed the data piecemeal, directory by directory, it would work. I then decided, for no particular reason, to double the very low transfer rate from 288K to 576K, and the full sync started to work.
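In case it helps, what I mean by piecemeal is roughly this - a rough sketch only, with LOCAL_DIR and BUCKET standing in for your own path and bucket:

Code:
#!/bin/sh
# Sync one top-level directory at a time instead of the whole tree at once.
# LOCAL_DIR and BUCKET are placeholders; trailing slashes matter to s3cmd,
# so check the result with --dry-run before trusting this.
for dir in LOCAL_DIR/*/ ; do
    name=$(basename "$dir")
    s3cmd sync --delete-removed --limit-rate=288K --multipart-chunk-size-mb=5 \
        --no-progress "$dir" "s3://BUCKET/$name/" \
        || echo "sync of $name failed" >&2
done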

Hope this helps someone.
 
06-15-2021, 06:08 AM   #2
aikempshall
Member

Registered: Nov 2003
Location: Bristol, Britain
Distribution: Slackware
Posts: 900

Original Poster
Rep: Reputation: 153
Well, it failed again today for the same reason.

In the eight or more years I've been running s3cmd, I had never had this error before.

Increasing the transfer rate seemed to fix it. In the past I've had problems with the chunk size instead, and had to decrease it.

Investigation continues.
 
08-02-2021, 05:53 AM   #3
aikempshall
Member

Registered: Nov 2003
Location: Bristol, Britain
Distribution: Slackware
Posts: 900

Original Poster
Rep: Reputation: 153
After more than a month of error-free syncing with AWS, it failed again yesterday morning and again this morning:

Quote:
Sun 1 Aug 09:59:55 BST 2021: I9033 - s3cmd sync --delete-removed --limit-rate=576K --multipart-chunk-size-mb=5 --no-progress /home/alex/FastStorage1/amazon_drive_encrypted/alex/Documents/ s3://mcmurchy1917-MyDocuments

Sun 1 Aug 10:06:14 BST 2021: E9030 - Failed sync with s3://mcmurchy1917-MyDocuments - errorcode 74
It always seems to fail before it starts syncing any data. This morning it just sat there for 6 minutes, silently doing something, before failing. Presumably it's trying to work out what it needs to sync and then something times out. On the last two failures it was trying to remove over 8,000 files from AWS.

I've posted a query to the s3cmd mailing list, but my post never got onto the list.

Anyway, as I've said previously, feeding the data piecemeal, directory by directory, has worked in the past when I get these 74 errors.

I've also now found that it works if I use the awscli package instead of the s3cmd package (both from SlackBuilds.org), and change my call from

Code:
s3cmd sync $DRYRUN ${DEBUG_STRING} --delete-removed ${LIMITRATE} ${CHUNKSIZE} ${PROGRESS} myfolder s3://mybucket/myfolder
to

Code:
aws s3 sync --delete myfolder s3://mybucket/myfolder
It works!
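One difference to be aware of if you switch: aws s3 sync has no --limit-rate or chunk-size options on the command line. As far as I can tell the equivalents are set in ~/.aws/config - a sketch, assuming the default profile, of carrying my old s3cmd settings across:

Code:
# rough equivalents of s3cmd's --limit-rate=576K and --multipart-chunk-size-mb=5
aws configure set default.s3.max_bandwidth 576KB/s
aws configure set default.s3.multipart_chunksize 5MB
# then the sync itself
aws s3 sync --delete myfolder s3://mybucket/myfolder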

I shall try again to post to the s3cmd mailing list; there have only been 3 postings this year, which probably explains why my posting of a month or so ago got missed.

Last edited by aikempshall; 08-02-2021 at 06:11 AM. Reason: More information about what S3cmd was doing
 
08-21-2021, 03:22 AM   #4
aikempshall
Member

Registered: Nov 2003
Location: Bristol, Britain
Distribution: Slackware
Posts: 900

Original Poster
Rep: Reputation: 153
I've posted a bug report with the s3cmd developers.

I've also found another workaround for this "feature", which only raises an error when it's deleting loads of objects from a bucket - presumably to ensure that a whole bucket isn't inadvertently emptied. There might be an undocumented limit of 1,000 objects; if s3cmd hits it, it bombs out with error code 74. (For what it's worth, Amazon's multi-object delete API does cap each request at 1,000 keys, so a limit around that number wouldn't be surprising.)

My workaround (a scripted sketch follows below) was to -
  1. run s3cmd with the --dry-run flag
  2. count the number of objects it wishes to delete
  3. verify that the count is correct
  4. add the flag --max-delete=${COUNT}, where ${COUNT} is the number of objects to delete
  5. run s3cmd without the --dry-run flag
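A rough scripted version of those steps, with SRC and BUCKET as placeholders - note that the "delete:" prefix I grep for is what my version of s3cmd prints in a dry run, so check yours first:

Code:
#!/bin/sh
# Placeholders - substitute your own source directory and bucket.
SRC=myfolder
BUCKET=s3://mybucket/myfolder

# Steps 1 and 2: dry run, then count the objects s3cmd says it would delete.
# My s3cmd prints one "delete: ..." line per object; verify yours does too.
COUNT=$(s3cmd sync --dry-run --delete-removed "$SRC" "$BUCKET" | grep -c '^delete:')

# Step 3: eyeball the number before going any further.
echo "s3cmd wants to delete $COUNT objects"

# Steps 4 and 5: the real run, capped at exactly that many deletions.
s3cmd sync --delete-removed --max-delete="$COUNT" "$SRC" "$BUCKET"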
 
1 member found this post helpful.
  


Similar Threads
[SOLVED] S3cmd failing when uploading to Amazon S3. (aikempshall, Slackware, 5 replies, last post 11-19-2018 04:09 AM)
s3cmd cp -r s3://bucket1/ s3://bucket2/ so slow it hurts (Sum1, Linux - Software, 4 replies, last post 11-25-2017 02:01 PM)
LXer: Encrypted offsite backup with EncFS, Amazon S3, and s3cmd (LXer, Syndicated Linux News, 1 reply, last post 07-31-2016 07:39 PM)
Amazon Linux AMI (Amazon Machine Image) - ec2 server - query about pem file (unclesamcrazy, Linux - Newbie, 11 replies, last post 11-27-2014 04:31 PM)
Issue using s3-put utility while uploading data to s3 amazon server (aurehman, Linux - Software, 0 replies, last post 02-25-2008 07:09 AM)
