Rsync error syncing over ssh driving me crazy
Hey Everyone,
I have a development server (remote) and a testing server on my local network. I'm trying to set things up so that files are synced from the development server to my local testing server, where I can edit and make modifications (which I will then sync back to the development server once changes are stable; that part is not the focus of this post). I set up rsync to work over SSH with a key, and that part works fine. The only things on the development server are HTML/PHP files; there really are not THAT many of them, nor are any of them large. The rsync command I use: Code:
rsync -rlptvv --delete -e "ssh -i /root/rsync/mirror-rsync-key" [user]@[domain]:/home/[user]/public_html/ /var/www/
It connects fine, copies/checks files, and after maybe 3-10 seconds (tends to be random) it will output the following:
Quote:
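For readers unfamiliar with the flags, here is the same command broken down option by option. This is only an annotated restatement of the command from the post (with the [user]/[domain] placeholders kept as-is), not something runnable as written, since it targets a remote host:

```shell
#   -r          recurse into directories
#   -l          copy symlinks as symlinks
#   -p          preserve permissions
#   -t          preserve modification times
#   -vv         extra verbose output (shows per-file detail)
#   --delete    remove files from /var/www/ that no longer exist remotely
#   -e "ssh -i ..."   use ssh as the transport, with a specific identity key
# Note the trailing slash on the source path: it copies the directory's
# contents into /var/www/ rather than creating a public_html subdirectory.
rsync -rlptvv --delete \
    -e "ssh -i /root/rsync/mirror-rsync-key" \
    "[user]@[domain]:/home/[user]/public_html/" /var/www/
```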
I've looked ALL over Google and everywhere else and can't find much. I found someone who said they compiled rsync with "ptmalloc" and that worked, but I'm not sure whether that was related to my exact error, and I'm not sure how to compile rsync with ptmalloc. I'm using Ubuntu Server 10.04, fully updated; rsync is version 3.0.7. (edit): The server is a fairly new 2U server with 4 GB of RAM and a 2.4 GHz Intel Core 2 Quad (quad core), so I don't believe it's an issue with the actual server memory, but maybe a memory limit being imposed on the user or something? I also don't believe it's an issue with the development server: I put SSH on heavy logging, and it shows that it's the client disconnecting, not an error on the server side. ANY and ALL help would be GREATLY appreciated. I've been a reader of these forums for probably 8+ years but never had to post much. Thanks guys and girls!
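One quick way to check the "memory limit imposed on the user" theory, a sketch to run as the same account that runs rsync on the testing server:

```shell
#!/bin/sh
# Print the resource limits this login inherits; an rsync started from
# the same account is subject to the same caps. "unlimited" for the
# memory-related entries would rule out a per-user memory cap.
ulimit -a
# Individual limits can also be queried, e.g. virtual memory in kbytes:
ulimit -v
```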
Hi hazey11,
Things we need to know:
1. Did it ever work before?
2. Since you are running rsync in verbose mode, did you notice which file it throws this error after?
3. If so, is it the same file it gets stuck on each time?
4. Roughly how many files are there in the source location?
5. Did you try copying some of the files from the source location to a separate directory and then running rsync on that?
Did you try running with the --max-size option? You can find the average size of the files in the source location and then use --max-size="average file size" and see if that works.
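The --max-size suggestion needs a concrete number. One rough way to compute an average file size in bytes is sketched below; the directory here is a throwaway stand-in created just for the demonstration, so point `find` at the real source tree instead:

```shell
#!/bin/sh
# Sketch: compute a rough average file size to try with rsync --max-size.
set -e
dir=$(mktemp -d)                      # stand-in for the real source tree
printf 'aaaa' > "$dir/a.html"         # 4 bytes
printf 'bbbbbbbb' > "$dir/b.php"      # 8 bytes
# Sum the byte counts of every regular file and divide by the file count;
# the "total" line that wc prints for multiple files is filtered out.
avg=$(find "$dir" -type f -exec wc -c {} + \
      | awk '$2 != "total" { s += $1; n++ } END { print int(s / n) }')
echo "average file size: ${avg} bytes"
rm -rf "$dir"
```

The resulting number could then be tried as `rsync --max-size="$avg" ...` to see whether skipping larger files avoids the error.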
1. Nope, this is the first time it's being set up.
2/3. Yes I did, and it's seemingly random (different files). It does seem to hit error_log a lot in one of the folders, but then I'll run it a few more times and it won't hit it again.
4. I'd say 4000+ files; it's a live web directory of various projects. I'm getting the exact number now, but it's taking its time with the count.
5. I have not. I would think it would work if it was under say 200-400 files, as it seems to run for maybe 5 seconds and gets all those files synced, but then it always gets caught up by those errors; as mentioned in #2/3, it seems to be random.
I'll give the --max-size option a shot. Do I just add it and it will return the average size? I'll try it like that, and if it doesn't work I'll look it up and update this post if I have any luck.
lithos: That first link is the one I came across mentioning compiling rsync manually with the ptmalloc library. I know how to compile it, but not necessarily how to slip in the ptmalloc library. And I'm not sure that updating, as the second link mentioned, is the answer; I've updated to 3.0.7 and noticed no change. I can still upgrade to 3.0.8 from source, but I'm not sure that's the source of the problem. Hmmmm.
You're welcome.
No, it will not return the average size by itself. You have to calculate the average size yourself, by looking at the largest and smallest files in that directory. You can use the following commands:
1. Get into the directory where the files are kept: Code:
cd /directory_where_files_are_kept
2. List the file sizes: Code:
du -ah
As you said that you think it would work if there were only 200-300 files, then try this:
1. Get the list of file sizes using du and save it to a file: Code:
du -ah | sort -g > files-list-test.txt
Note: As you said these files are on your production server, I would suggest running du outside office hours, just to avoid any slowness during business hours.
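As a concrete illustration of the listing step, the sketch below runs the same `du -ah | sort -g` pipeline on a throwaway directory populated with two dummy files; in practice you would run the pipeline inside the real source directory instead:

```shell
#!/bin/sh
# Sketch: demonstrate the size-listing pipeline on a throwaway directory.
set -e
dir=$(mktemp -d)
head -c 1024  /dev/zero > "$dir/small.html"   # ~1 KiB dummy file
head -c 10240 /dev/zero > "$dir/large.html"   # ~10 KiB dummy file
# du -ah lists every file with a human-readable size; sort -g orders the
# lines by the leading number, so small files come first.
listing=$(cd "$dir" && du -ah | sort -g)
printf '%s\n' "$listing"
rm -rf "$dir"
```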