
aikempshall 06-04-2021 11:38 AM

S3cmd failed with Error Code 74 when uploading to Amazon S3
 
My backup to Amazon S3 started failing on Tuesday morning. This is the command:

Code:

s3cmd sync --delete-removed --limit-rate=288K --multipart-chunk-size-mb=5 --no-progress LOCAL_DIR s3://BUCKET
I searched everywhere for an explanation/solution to this problem; all I could come up with was this very terse note:

Quote:

An error occurred while doing I/O on some file.
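
For what it's worth, exit code 74 matches EX_IOERR in sysexits.h, the BSD exit-code convention that s3cmd's error codes appear to follow:

Code:

grep EX_IOERR /usr/include/sysexits.h
# prints: #define EX_IOERR  74  /* input/output error */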
After much playing with the --verbose and --debug flags, I discovered that if I fed the data piecemeal, directory by directory, it would work. For some reason I then decided to double the very low transfer rate from 288K to 576K, and it started to work.
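
For anyone wanting to try the piecemeal route, this is roughly what I mean - a sketch only, and the per-directory destination layout is illustrative, so adjust it to your bucket:

Code:

# sync each top-level directory separately instead of all at once
for d in LOCAL_DIR/*/ ; do
    s3cmd sync --delete-removed --limit-rate=576K --multipart-chunk-size-mb=5 \
        --no-progress "$d" "s3://BUCKET/$(basename "$d")/"
done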

Hope this helps someone.

aikempshall 06-15-2021 06:08 AM

Well, it failed again today for the same reason.

In the eight or more years of running s3cmd, I'd never had this error.

Increasing the transfer rate seemed to fix it. In the past I'd had problems with the chunk size and had to decrease it.

Investigation continues.

aikempshall 08-02-2021 05:53 AM

After more than a month of error-free syncing with AWS, it failed again yesterday morning and again this morning:

Quote:

Sun 1 Aug 09:59:55 BST 2021: I9033 - s3cmd sync --delete-removed --limit-rate=576K --multipart-chunk-size-mb=5 --no-progress /home/alex/FastStorage1/amazon_drive_encrypted/alex/Documents/ s3://mcmurchy1917-MyDocuments

Sun 1 Aug 10:06:14 BST 2021: E9030 - Failed sync with s3://mcmurchy1917-MyDocuments - errorcode 74
It always seems to fail at a point before it starts syncing any data. This morning it just sat there for 6 minutes, silently doing something, before failing. Presumably it's trying to work out what it needs to sync, and then something times out. On the last two failures it was trying to remove over 8,000 files from AWS.

I've posted a query to the s3cmd mailing list, but my post never got onto the list.

Anyway, as I've said previously, feeding the data piecemeal, directory by directory, has worked in the past when I get these 74 errors.

I've also now found that if I use the awscli package from SlackBuilds.org instead of the s3cmd package, also from SlackBuilds.org, and change my call from

Code:

s3cmd sync $DRYRUN ${DEBUG_STRING} --delete-removed ${LIMITRATE} ${CHUNKSIZE} ${PROGRESS} myfolder s3://mybucket/myfolder
to

Code:

aws s3 sync --delete myfolder s3://mybucket/myfolder
It works!
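
Note that aws s3 sync has no direct command-line equivalents of --limit-rate and --multipart-chunk-size-mb; if you need them, the CLI's s3 transfer settings in ~/.aws/config should give a rough match - something like this (check the s3 config topic in "aws help" for your version):

Code:

# rough awscli counterparts to s3cmd's --limit-rate / --multipart-chunk-size-mb
aws configure set default.s3.max_bandwidth 576KB/s
aws configure set default.s3.multipart_chunksize 5MB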

I shall try again to post to the s3cmd mailing list; there have only been 3 postings this year, which probably explains why my posting of a month or so ago got missed.

aikempshall 08-21-2021 03:22 AM

Posted a bug report with the people at s3cmd.

I've also found another workaround for this "feature", which only raises an error when it's deleting loads of objects from a bucket. This ensures that the whole bucket isn't inadvertently emptied. There might be an undocumented limit of 1,000 objects; if it hits this, it bombs out with error code 74.

My workaround (sketched in the script after this list) was to -
  1. run the s3cmd sync with the --dry-run flag
  2. count the number of objects it wishes to delete
  3. verify the count is correct
  4. add the flag --max-delete=${COUNT}, where ${COUNT} is the number of objects to delete
  5. run the s3cmd sync without the --dry-run flag
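
A rough shell sketch of that workaround. It assumes the dry-run output lists pending deletions as lines starting with "delete:", which is worth checking against your own output first:

Code:

#!/bin/sh
# Sketch of the dry-run/count workaround described above.
# Assumption: s3cmd's dry run prints one "delete: ..." line per object.
SYNC_ARGS="--delete-removed --limit-rate=576K --multipart-chunk-size-mb=5 --no-progress"

# Steps 1-3: dry run, count the pending deletions, and eyeball the log
s3cmd sync --dry-run $SYNC_ARGS LOCAL_DIR s3://BUCKET | tee /tmp/s3cmd-dryrun.log
COUNT=$(grep -c '^delete:' /tmp/s3cmd-dryrun.log)
echo "Objects to be deleted: $COUNT"

# Steps 4-5: real run, capped at exactly that many deletions
s3cmd sync --max-delete="$COUNT" $SYNC_ARGS LOCAL_DIR s3://BUCKET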

