Quote:
Originally Posted by wazzu62
I have a Dell R900 running RHEL 5.1 with a Quantum LTO-4 tape drive connected to a SAS1068E SAS card. It has four 2.93 GHz dual-core processors and 16 GB of memory. The files I am backing up live on a SAN (2 Gb fabric).
I am using standard tar commands to write to the tape and am trying to improve performance. The drive is rated at 120 MB/sec.
Using the following command I can see about 112-114 MB/sec throughput:
dd if=/dev/zero of=/dev/nst0 bs=1024k count=10240
but with tar I only see about 8-14 MB/sec:
tar -cvpf /dev/nst0 -b 512 /path/to/files --totals
I am backing up lots of small files: several directories, each containing about 150,000 files. If I run a test and back up a single 10 GB file using the same tar command, I get 100+ MB/sec throughput.
Are there other options I can look at with tar or settings I can set via the mt command? I suspect part of the problem is the sheer number of files in each directory.
Any help would be appreciated.
My guess would be that 'dd' is just shoveling raw data to the drive, whereas 'tar' is creating a single archive from many files: opening each one, reading its metadata, adding it to the archive, verifying what's been written, etc. That's where your overhead is coming from.
I'd try creating the archive in a temp directory on your hard drive first, then copying that single file to tape, and see what happens...
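A minimal sketch of that two-stage approach, assuming paths like those in the post. For the demo it writes to temporary files so it can run anywhere; on the real system you would point `TAPE` at `/dev/nst0` and `SRC` at the SAN directory:

```shell
#!/bin/sh
# Two-stage backup sketch: build the archive on local disk first, then
# stream the finished file to tape with dd's large sequential writes.
# All paths here are demo stand-ins (assumptions, not from the post).
set -e

SRC=$(mktemp -d)       # stand-in for /path/to/files on the SAN
STAGE=$(mktemp)        # staging archive on fast local disk
TAPE=$(mktemp)         # stand-in for /dev/nst0

# Create a few small files so the demo has something to archive.
for i in 1 2 3; do echo "data $i" > "$SRC/file$i"; done

# Stage 1: tar walks the many small files at disk speed, with no tape
# streaming constraint (-p preserves permissions, as in the post).
tar -cpf "$STAGE" -C "$SRC" .

# Stage 2: dd streams the single large archive in 1 MiB blocks, the
# same access pattern that already reached ~112 MB/sec on this drive.
dd if="$STAGE" of="$TAPE" bs=1024k

# Confirm the copy on "tape" is byte-identical to the staged archive.
cmp "$STAGE" "$TAPE" && echo "archive copied intact"
```

The point of the split is that the tape drive only ever sees one big sequential stream, so it can keep streaming instead of stop-start "shoe-shining" while tar stats and opens 150,000 small files.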