
hazey11 02-25-2012 07:31 PM

RSync error syncing over ssh driving me crazy
 
Hey Everyone,
I have a development server (remote), and a testing server on my local network. I'm trying to set it up so that files are synced from the development server to my local testing server so I can edit and make modifications (which I will then sync back to the development server when changes are stable - this part I'm not focusing on in this post).

So I set up rsync to work over SSH with a key and that part works fine. The only things on the development server are HTML/PHP files, and there really aren't THAT many of them, nor are any of them particularly large.

The rsync cmd I use:
rsync -rlptvv --delete -e "ssh -i /root/rsync/mirror-rsync-key" [user]@[domain]:/home/[user]/public_html/ /var/www/

It connects fine and copies/checks files, and after maybe 3-10 seconds (it tends to be random) it outputs the following:
Quote:

ERROR: out of memory in flist_expand [sender]
rsync error: error allocating core memory buffers (code 22) at util.c(117) [sender=3.0.6]
Then it continues to list maybe 50-80 files as up to date, then stops with:
Quote:

rsync: connection unexpectedly closed (63062 bytes received so far) [receiver]
rsync error: error in rsync protocol data stream (code 12) at io.c(601) [receiver=3.0.7]
rsync: connection unexpectedly closed (173 bytes received so far) [generator]
rsync error: error allocating core memory buffers (code 22) at io.c(601) [generator=3.0.7]
Sometimes the first error shows up with the rest, but the last few times I've run it, it has been showing up higher in the window (as I said, it seems to continue with more files before the second group of errors).

I've looked ALL over Google and everywhere else and can't find much. I found someone who said they compiled rsync with "ptmalloc" and that worked, but I'm not sure whether that was related to my exact error, and I'm not sure how to compile rsync with ptmalloc.

I'm using Ubuntu Server 10.04, fully updated; rsync is version 3.0.7.
(edit): The server is a fairly new 2U server with 4 GB of RAM and a 2.4 GHz Intel Core2 Quad, so I don't believe it's an issue with the actual server memory, but maybe a memory limit is being imposed on the user or something?
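
In case it matters, this is the sort of thing I mean by checking for a per-user limit (same key and host as in my rsync command above):
Code:

ulimit -a    # limits for the current shell/user on the testing server
ssh -i /root/rsync/mirror-rsync-key [user]@[domain] 'ulimit -a'    # same check on the development server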

I do not believe it's an issue with the development server, as I turned up SSH logging and it shows that it's the client that is disconnecting, not an error on the server side.

ANY and ALL help would be GREATLY appreciated. I've been a reader of these forums for probably 8+ years but never had to post much. Thanks, guys and girls!

T3RM1NVT0R 02-25-2012 09:20 PM

@ Reply
 
Hi hazey11,

Things we need to know:

1. Did it ever work before?
2. As you are running rsync in verbose mode, did you notice the file after which it throws this error?
3. If so, is it the same file on which it gets stuck every time?
4. Roughly how many files are there in the source location?
5. Did you try copying some of the files from the source location to a separate directory and then running rsync on that?

Did you try running with the --max-size option? You can find out the average file size in the source location and then use it with --max-size=<average file size> and see if that works.
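
For example, something along these lines (the 500K value is just a placeholder; substitute whatever your average works out to):
Code:

rsync -rlptvv --delete --max-size=500K -e "ssh -i /root/rsync/mirror-rsync-key" [user]@[domain]:/home/[user]/public_html/ /var/www/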

lithos 02-26-2012 02:44 AM

Hi,

I don't have a solution to the trouble you mention, but
here is the same problem
and this suggests updating rsync to the latest version (and making sure both ends run the same version).
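
A quick way to compare the two ends (host and key path taken from your rsync command):
Code:

rsync --version | head -1    # version on the testing (local) side
ssh -i /root/rsync/mirror-rsync-key [user]@[domain] 'rsync --version | head -1'    # version on the development (remote) side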

hazey11 02-26-2012 01:34 PM

Quote:

Originally Posted by T3RM1NVT0R (Post 4612089)
Hi hazey11,

Things we need to know:

1. Did it ever work before?
2. As you are running rsync in verbose mode, did you notice the file after which it throws this error?
3. If so, is it the same file on which it gets stuck every time?
4. Roughly how many files are there in the source location?
5. Did you try copying some of the files from the source location to a separate directory and then running rsync on that?

Did you try running with the --max-size option? You can find out the average file size in the source location and then use it with --max-size=<average file size> and see if that works.

Thanks for your reply T3RM1NVT0R,
1. Nope, this is the first time it's being set up.
2/3. Yes I did, and it's seemingly random (different files). It does seem to hit error_log a lot in one of the folders, but then I'll run it a few more times and it won't hit it again.
4. I'd say 4000+ files; it's a live web directory of various projects. I'm getting the exact number now, but it's taking its time getting the count (quick count command below).
5. I have not. I would think it would work if it was under, say, 200-400 files, as it seems to work for maybe 5 seconds and gets all those files synced, but then it always gets caught by those errors; as mentioned in #2/3, it seems to be random.
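
For the count in #4 I'm just running something like this on the development server (path from my rsync command above):
Code:

find /home/[user]/public_html -type f | wc -l    # count regular files under the web root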

I'll give it a shot running with the --max-size option. Do I just add that and it will return the average size? I'll try it like that, and if it doesn't work I'll look it up and update this post if I have any luck.

lithos: That first link is the one I came across mentioning compiling rsync manually with the ptmalloc library. I know how to compile it, but not necessarily how to slip in the ptmalloc library. And I'm not sure if updating, as the second link mentioned, is the answer; I've updated to 3.0.7 and noticed no change. I can still upgrade to 3.0.8 from source, but I'm not sure if that is the source of the problem. Hmmmm.
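
If I do end up building 3.0.8 from source, I'm assuming it's the usual routine, something like this (download URL from memory, so double-check it; I still have no idea where ptmalloc would fit in):
Code:

# build rsync 3.0.8 from source (download URL assumed; verify on rsync.samba.org)
wget http://rsync.samba.org/ftp/rsync/src/rsync-3.0.8.tar.gz
tar xzf rsync-3.0.8.tar.gz
cd rsync-3.0.8
./configure --prefix=/usr/local
make
sudo make install
/usr/local/bin/rsync --version    # confirm the new binary is the one being picked up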

T3RM1NVT0R 02-26-2012 02:17 PM

@ Reply
 
You're welcome.

No, it will not return the average size by itself. You have to calculate the average size yourself by finding the largest and smallest files in that directory. You can use the following commands on that directory:

1. Get into that directory using:
Code:

cd /directory_where_files_are_kept
2. Run du to see the file sizes in that location, for example:
Code:

du -ah
Once you have calculated the average size (it is your choice whether to go with the lower end or the upper end), you can use the --max-size=<average file size> option with rsync. Suppose the average comes to 300 MB, with the largest file at 500 MB and the smallest at 100 MB; by the lower end I mean something like 200 MB (so your --max-size=<upper or lower limit>).
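
Putting that together, a rough sketch (run on whichever end holds the source files; the 200M value just follows the example figures above, and I am using find rather than plain du so directory totals don't get mixed in):
Code:

cd /directory_where_files_are_kept
find . -type f -exec du -k {} + | sort -n | head -3    # smallest files, sizes in KB
find . -type f -exec du -k {} + | sort -n | tail -3    # largest files, sizes in KB
# example: smallest ~100 MB, largest ~500 MB -> average ~300 MB, lower end ~200 MB
rsync -rlptvv --delete --max-size=200M -e "ssh -i /root/rsync/mirror-rsync-key" [user]@[domain]:/home/[user]/public_html/ /var/www/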

As you said that you think it would work if there were only 200-300 files, try this:

1. Get the file sizes using du and save the list:

Code:

du -a | sort -n > files-list-test.txt
Using this list you can copy the largest files into a separate directory and try to run rsync on that.
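
A rough sketch of that test, assuming you run the copy on the development server from the same directory where you ran du (directory names are just examples):
Code:

mkdir /home/[user]/rsync-test
# take the biggest entries from the bottom of the list, skip directories, copy the files
tail -20 files-list-test.txt | cut -f2- | while read -r path; do [ -f "$path" ] && cp "$path" /home/[user]/rsync-test/; done
# then, from the testing server, pull just that directory:
rsync -rlptvv -e "ssh -i /root/rsync/mirror-rsync-key" [user]@[domain]:/home/[user]/rsync-test/ /tmp/rsync-test/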

Note: As you said these are files on your production server, I would suggest running du out of hours, just to avoid any slowness issues during business hours.

