Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
I am new to Linux. I am trying to copy data from one disk to another.
I have already mounted /dev/sdf on rdf, but when I try to create a directory inside rdf I get:
1. "cannot create directory: Read-only file system"
2. I am also not able to umount rdf.
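A sketch of how to chase both symptoms, assuming the device and mount point names from the post above (/dev/sdf mounted on rdf):

```shell
# See how the device is mounted -- "ro" in the options means read-only.
# A disk with I/O errors is often forced read-only by the kernel; check dmesg.
mount | grep sdf

# Try remounting read-write (only works if the filesystem itself is healthy)
mount -o remount,rw /dev/sdf rdf

# "umount: target is busy" usually means some process (often your own
# shell) still has a file open or a working directory inside the mount point
fuser -vm rdf
cd / && umount rdf
```

If the remount fails, the filesystem probably has errors; unmount it and run fsck on the device before copying anything to it.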
I have some other questions:
1. How do I list the processes currently running on Linux (so that I can terminate a process that is consuming too much CPU)?
2. How do I check my Linux server's processor/RAM details?
3. The backup is taking a week to complete. I need to reduce that time. What can be done to improve the process?
You can get a summary of current CPU/memory/disk usage by running 'top'.
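To expand on that a little, the standard procps/util-linux tools cover both questions (available on practically every distro):

```shell
# One-shot, non-interactive snapshot from top
top -b -n 1 | head -n 15

# The most CPU-hungry processes, sorted
ps aux --sort=-%cpu | head -n 11

# Terminate a process by PID from the list above:
#   kill PID        (polite, SIGTERM)
#   kill -9 PID     (last resort, SIGKILL)

# Processor and RAM details
lscpu
free -h
```

/proc/cpuinfo and /proc/meminfo give the same hardware details in raw form if lscpu or free are missing.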
As for backups, we'd need to know a lot more detail, e.g. what your system specs are, what and how much you are trying to back up, and what technique/code you are currently using.
We are doing the backup with Amanda. The total size is around 6 TB, but the process is taking a week. Your suggestions, please?
Any idea about roaming profiles?
We just migrated from a Linux AD to Windows, creating a new domain. But when I try to log in with the same (old domain's) user in the new domain, I am not able to get the roaming profile. The error is "Could not locate the server copy of your roaming profile", and Windows is now logging me in with a temporary profile.
Any idea how to make it work properly, without any issues?
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,197
Rep:
With respect to backing up 6TB -- this will depend on the configuration and tuning of your system. Amanda uses native utilities. I presume you've configured it to use gnutar and are not trying to compress or encrypt? Have you tried doing a straight copy or gnutar to see how fast it goes? If you describe your hardware and configuration, perhaps someone could make some suggestions.
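For instance, to get a baseline read rate with nothing but tar in the path (the path below is a placeholder -- point it at one of the trees you actually back up):

```shell
# Time a plain, uncompressed tar of one source tree, discarding the output,
# to see how fast the disks can be read with no network or compression involved
SRC=${SRC:-/usr/share/doc}    # placeholder -- substitute one of your data directories
time tar cf /dev/null "$SRC"
```

Dividing the tree's size by the elapsed time gives the raw disk throughput; if that alone extrapolates to days for 6TB, the bottleneck is the disks rather than Amanda.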
This CMS Automatic Backup System Plus V2 is a 1.5TB external hard disk drive. Compared to other external hard drives on the market, it is expensive at around $225. It connects via either FireWire or USB 2.0 for increased flexibility. This is a 7200 rpm drive and comes with a 32MB buffer.
Hmm. Not much to go on here. That's just the external drive. It ought to be able to take the backup faster than you reported earlier. However, you originally indicated you were trying to back up 6TB, and now you report that you are using a 1.5TB drive to do it. Which is right?
(Edit: My mistake. I thought the OP was coming back with information on his drive. Still waiting to hear.)
What is the system you are backing up from? What else is it doing? Have you tried just doing copies or tars to see how fast it can go?
Last edited by choogendyk; 12-21-2009 at 07:39 PM.
Actually, I am trying to take a backup of a Linux server that holds around 6 TB. For that we are using 4 HDDs of 1.5 TB each.
When I start the backup it takes around 74 hours. Normally it should not take this much time.
You still aren't telling us much about your configuration. But it sounds like a potential tight squeeze. You're backing up 6TB and using 4x1.5 = 6TB of space to back it up. How is that configured? What are the original disklist entries? What sort of values do you have for dumpcycle, runspercycle and tapecycle? It sounds like you probably have no holding disk, right? Have you broken up the source volumes into smaller pieces in the disklist file? And what is the hardware configuration? How are the drives connected? What kind of drives are they? What sort of server?
It's an AMD server, and we are using a holding disk.
The backup cycle is 15 days. dumpcycle?
Please check the attached file for the other details. Right now we have connected the external HDD to the Amanda server, so the backup from the client (the Linux server) goes onto that HDD. Since it runs over the network, it's taking quite a long time.
You can find those parameters (e.g. dumpcycle) in your amanda.conf file in the configuration directory, which might be someplace like /etc/amanda/DailySet1/amanda.conf. Or, you could use amgetconf (see `man amgetconf`) to list configuration parameters.
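For example, assuming the configuration is called DailySet1 (the first argument must match the name of your configuration directory under /etc/amanda -- substitute your own):

```
amgetconf DailySet1 dumpcycle
amgetconf DailySet1 runspercycle
amgetconf DailySet1 tapecycle
```

If the configuration name doesn't match an existing directory, amgetconf will complain that the file doesn't exist.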
So, you're pulling 6TB over the network, and you have an AMD linux server with 2G memory. Still not giving us much to go on. Is your network 10Mb, 100Mb or 1000Mb (GigE)? Is there lots of other traffic or congestion on it? Have you tested other transfers for performance using ftp or scp? Are you running the backup through an ssh pipe? How fast are the cpu's on the servers? How much overall disk space? How large is the holding disk? And, again, what does your Amanda disklist look like? Are you breaking up the 6TB into smaller pieces for backup?
That external drive you've connected to the Amanda server -- is that the holding disk? Or is that being used as virtual tapes for backup? How large is it, and how is it configured?
There are a lot of questions that need answers before we can help you. However, making guesses, and assuming you have no fundamental problems with network performance or server capabilities, the way I would try to back up 6TB across the network would be to break it up into a large number of smaller pieces by defining them in Amanda's disklist. Then I would have a dumpcycle of somewhere between a week and a month, with nightly backups. The runspercycle would be the number of days in a dumpcycle. The tapecycle would be something larger than the dumpcycle. This way, the full backups of individual parts of the 6TB would be distributed over the dumpcycle and not all done at once.

When you start up a configuration like that, add a few disklist entries to the disklist every night until it gets going. Otherwise, it will need to start out with all fulls, since incrementals don't make sense without a full.

Also, if it is on a secure internal network, I would not push it through ssh, since that adds significant overhead to the transfer. And, as far as compression goes, that would depend on where you have the cpu cycles. If your backup server is faster and isn't doing other things, do server side compression. If the backups were coming from a larger number of other servers, all of which had significant cpu capacity, and the backup server was more limited, then I would do client side compression.
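As a sketch of that scheme (hostnames, paths and numbers below are made-up examples; comp-user-tar and user-tar are the stock example dumptypes from the sample amanda.conf):

```
# amanda.conf -- spread full dumps across the cycle
dumpcycle 2 weeks     # each disklist entry gets one full dump per two weeks
runspercycle 14       # one amdump run per night
tapecycle 20 tapes    # keep more (v)tapes than runs in a cycle

# disklist -- many small disk list entries instead of one huge 6TB entry
fileserver1  /export/home      comp-user-tar
fileserver1  /export/projects  comp-user-tar
fileserver2  /export/archive   user-tar
```

With entries split this way, Amanda's planner levels the load itself, scheduling a few fulls and many cheap incrementals each night.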
Anyway, maybe that will give you something to work with.
I used to schedule and check the status via ZMC.
The network speed is 100 Mbps.
Holding disk space: 15 vtapes, 225 TB (max space used).
The total backup size is 6 TB, which comprises 2 file servers, but I tried with only one server, and that alone took 74 hours. I am not breaking it up; I am just adding all the directories via the ZMC "Backup What" page and scheduling the backup.
I have no idea how the disk was configured; it might be through LVM (the four separate HDDs were configured as one "BigDisk").
I tried `amgetconf daily logdir`, but I am getting "no such file or directory".