tar problems in a cron job
hello,
I have some cron jobs that make small backups with tar. If I run them manually, they work perfectly every time, but when cron starts them it doesn't work: the created archive is too small, and if I type "tar tvf filename" I see some files in the archive and then: "tar: Unexpected EOF in archive / tar: Error is not recoverable: exiting now". Does anyone have an idea? Best regards, toredo
To back up your files you can use "cpio".
Hi Toredo,
here are some thoughts...
1. Are the files accessible to the specified cron user? E.g. correct file and directory permissions; is the file system mounted and accessible to the cron user?
2. How big is the tar file, and what type of file system are you using? Some file systems have fairly low file size limits; e.g. FAT16 has a file size limit of 2 GB and FAT32 has its limit at 4 GB.
3. What is the output of: Code:
ulimit -a
Hope this helps. Best regards, Tim
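On the third point, the limit most relevant to a truncated archive is the maximum file size; a quick way to check just that value:

```shell
# Show the maximum size of files the shell may create, in 512-byte blocks.
# "unlimited" means no cap; a small number here could truncate a large
# tar archive mid-write, producing exactly an "Unexpected EOF" on read.
ulimit -f
```

Note that cron may apply different limits than an interactive shell, so it is worth running this from inside the cron job itself as well.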
Hello toredo,
Your title is rather indefinite and doesn't provide an accurate synopsis of the problem. A descriptive title is always beneficial in the long run; it helps you get more relevant replies faster, and it allows other users to find the thread with the search function. Your description is OK, but it would improve the readability of your post if you used correct spelling, punctuation and grammar. Please see this tutorial for more details. By the very nature of the error, one would assume that the file in question may be corrupt, or that it is not really compressed. Wikipedia provides proof of this concept. Use:- Code:
file /path/to/file
Remember that a simple Google search will often yield some insight into any particular problem.
Redirect the output of stderr to a file and study it for error messages. Writes to stdout and reads from stdin are blocked in a cron job, which could account for a partially complete archive. Make sure you define any environment variables and PATH entries you need. Also post your cron job; more eyeballs may be able to spot something you missed.
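For instance, a cron entry along these lines would capture any error messages (the script path, PATH value and log location here are illustrative, not from the thread):

```shell
# /etc/crontab fragment (illustrative). Cron runs jobs with a minimal
# environment, so set PATH explicitly and send stdout and stderr to a log
# file that can be inspected after the job runs.
PATH=/usr/sbin:/usr/bin:/sbin:/bin
20 3 * * * root /path/to/backup.sh >> /var/log/backup.log 2>&1
```

The `2>&1` must come after the `>>` redirection so that stderr follows stdout into the log file.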
Quote:
1) The last few times I tried to run it as root, but there was the same error. 2) The resulting file is ~30 MB; the source and the target filesystems are both ext3. 3) This is the output from "ulimit -a": Code:
core file size (blocks, -c) 0
Quote:
OK, I used the file command: Code:
10-03-16.tar.bz: bzip2 compressed data, block size = 900k
OK, I'll use the Google link to search again. I already searched, but maybe I used the wrong keywords. Quote:
20 3 * * * root /root/scripts/backup.sh /home/vss vss 20
backup.sh creates a backup archive on a defined path. The number on the right is the maximum number of backup archives for "/home/vss". Best regards, toredo
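The script itself was not posted; purely as a sketch, a backup.sh taking a source directory, an archive prefix and a retention count (matching the three arguments in the crontab line) might look roughly like this. The target directory, the `backup` function name and the date format are assumptions, not the poster's actual code:

```shell
#!/bin/sh
# Hypothetical sketch of backup.sh -- the real script was not posted.
# backup <source-dir> <prefix> <max-archives>
backup() {
    SRC="$1"; PREFIX="$2"; KEEP="$3"
    DEST="${DEST:-/tmp/backups}"      # illustrative target directory
    mkdir -p "$DEST"
    STAMP=$(date +%y-%m-%d)           # matches the 10-03-16.tar.bz naming above
    # Fail loudly if tar fails, instead of leaving a silent partial archive.
    tar cjf "$DEST/$PREFIX-$STAMP.tar.bz2" \
        -C "$(dirname "$SRC")" "$(basename "$SRC")" || return 1
    # Keep only the newest $KEEP archives for this prefix.
    ls -1t "$DEST/$PREFIX-"*.tar.bz2 | tail -n +"$((KEEP + 1))" | xargs -r rm -f
}

# Example run on a throwaway directory:
mkdir -p /tmp/demo-src
echo data > /tmp/demo-src/f.txt
DEST=/tmp/demo-backups
backup /tmp/demo-src vss 20
```

Checking the exit status of tar inside the script, as above, is one way to make a failure visible in the mail or log output cron produces, rather than only discovering it later via "Unexpected EOF".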
Hello again toredo,
Have you tried the command:- Quote: