You're "moving terabytes of files around," and "you've filled the array." Any computerized process slows down dramatically when it starts to run out of storage space. Don't assume that the issue is "flushing its disk caches"; instead, find out what it actually is.
You need more space: double that allocation and spread the files out across it. You probably also need a better strategy than blindly re-copying with rsync if you know that the files aren't usually changing; or let rsync's default quick check (file size and modification time) decide what has changed, rather than forcing full-content checksums. Maintain a separate catalog database somewhere (SQLite?) that tells you what needs to be copied. This isn't a "generic" activity: you know an awful lot about it, and you will benefit by applying that application-specific (human) knowledge and familiarity in a fairly customized solution.
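The "catalog database" idea can be sketched in a few lines of Python with the standard-library sqlite3 module. This is a minimal illustration, not a finished tool: the database filename, the `files` table, and the `changed_files` function are all invented here. It records each file's size and mtime, and on later scans reports only paths that are new or changed, using the same quick check (size plus timestamp) that rsync uses by default.

```python
import os
import sqlite3

def changed_files(root, db_path):
    """Return paths under `root` that are new or changed since the last scan,
    and update the catalog to reflect the current state."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS files
                   (path TEXT PRIMARY KEY, size INTEGER, mtime REAL)""")
    to_copy = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            row = con.execute("SELECT size, mtime FROM files WHERE path = ?",
                              (path,)).fetchone()
            # rsync's default quick check: size + modification time.
            if row != (st.st_size, st.st_mtime):
                to_copy.append(path)
                con.execute("INSERT OR REPLACE INTO files VALUES (?, ?, ?)",
                            (path, st.st_size, st.st_mtime))
    con.commit()
    con.close()
    return to_copy
```

A list like this can be handed straight to rsync via its `--files-from` option, so each run copies only what the catalog says has changed. Keep the database file outside the tree you are scanning, or it will show up as "changed" on every run.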
It is very helpful to use the nice command to push down the execution priority of that process, which you know to be "absolutely I/O-bound" anyhow. This reduces the process's impact on other activities. It will spend nearly all of its time waiting for I/O, and when it does become runnable again, it can well afford to be the low man on the totem pole.
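For a process you start from a script rather than a shell, the same effect as `nice -n 19 command` is available through the standard library. A minimal sketch (the printed message is just for illustration):

```python
import os

# Push this process's CPU scheduling priority down to the weakest
# level, equivalent to launching it with `nice -n 19 command`.
# An unprivileged process can raise its niceness, but never lower
# it back down afterward.
new_nice = os.nice(19)
print("now running at niceness", new_nice)
```

On Linux there is also ionice(1) for the I/O side of the equation: `ionice -c 3` puts a process in the "idle" I/O scheduling class, so it only gets disk time that nobody else wants.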
Last edited by sundialsvcs; 04-27-2012 at 09:28 AM.