Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
I have Linux shared hosting with cPanel. I want to copy many large files within my hosting account. I tried to do this using simple copy commands in a bash script. It did the copy, but it seems to have exceeded my allocated bandwidth limit: I/O Usage turned red. I was later surprised to find that the bash script kept running even after I logged out. This would not be normal behaviour for a PC, but is it normal for a hosted server?

I have more copying to do and would like a better method. I could copy each separate file manually, but that is tedious. I could try cron, but the documentation says I should use at, and I would need to set up a separate job for each file. I suppose a good way is to put at statements into a bash script. But I cannot get at working. I tried:
Code:
at -f file.sh now
You are not authorized to run this command
Code:
/bin/sh/at -f file.sh now
bash: /bin/sh/at: Not a directory
Code:
man batch
bash: man: command not found
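As an aside, /bin/sh is the shell binary itself, not a directory, which is why the /bin/sh/at path fails with "Not a directory". A quick way to check whether, and where, at is installed:
Code:
command -v at      # prints the full path if at is on your PATH
ls -l /usr/bin/at  # a typical location on many systems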
I tried experimenting on my offline PC: at is a package that has to be installed, and I read that I also need to run the atd daemon.
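On a personal test machine the setup is normally just this; a minimal sketch assuming a Debian/Ubuntu-style system with apt and systemd (adjust the package manager for your distro):
Code:
sudo apt install at                # install the at package
sudo systemctl enable --now atd    # start the daemon that runs at jobs
echo 'touch /tmp/at-worked' | at now + 1 minute   # a quick test job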
1. rsync has bandwidth control (--bwlimit), if you need that. It also uses ssh by default for remote connections, which provides some security where needed. (A sketch follows at the end of this post.)
2. at is a system scheduler, and access to it is controlled by the administrator through /etc/at.allow and /etc/at.deny, so on shared hosting you may well not be authorized to use it.
3. cron has a per-user interface and can run scripts or processes under your personal account. Look up crontab ("man crontab") for details.
You have embarked upon a journey of discovery! This is fun stuff with MANY moving parts. Welcome to the jungle, I hope you enjoy the ride!
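For point 1, here is a minimal sketch of a rate-capped local copy (the paths are hypothetical; rsync's --bwlimit takes a rate in KB per second, so 2000 is roughly 2 MB/s):
Code:
# Copy everything under source/ into target/, capped at about
# 2 MB/s so the host's I/O Usage meter stays in the green.
rsync -a --bwlimit=2000 /home/user/source/ /home/user/target/
The trailing slash on source/ matters: it copies the contents of source/ rather than the directory itself.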
This copying is all within the remote system; the data does not enter or leave the system. The bash script on the server does the copy, but it also causes a bandwidth warning at 4 MB/s.
No single shared hosting account is permitted to use more than 20% of the server resources at a time.
Will this bash script work? Please help me check the synopsis. The && should ensure that the next command starts only after the previous one finishes successfully. Is it needed? Am I right? After I start the script, can I log out and wait for it to finish?
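For anyone following along, here is a minimal sketch of the kind of script being discussed; copy.sh and the file names are hypothetical. Each && lets the next copy start only if the previous one succeeded, and nohup with & keeps the job alive after logout:
Code:
#!/bin/bash
# copy.sh - copy each large file in turn; the chain stops
# at the first failed copy instead of piling up more I/O.
cp -p source/file1.iso target/ &&
cp -p source/file2.iso target/ &&
echo "all copies finished" > copy.done
Code:
# Start it detached, with output captured, then log out:
nohup bash copy.sh > copy.log 2>&1 &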
I did research and it looks like this will work. I am marking this as solved. I do not need to run the code yet, so you still have time to warn me if needed. The --dry-run will be removed later.
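For the record, the verify-then-run pattern mentioned above looks something like this sketch (hypothetical paths): with --dry-run (or -n), rsync only reports what it would copy; run the same command without the flag to do the real copy.
Code:
# First pass: report only, nothing is copied.
rsync -av --dry-run /home/user/source/ /home/user/target/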
MB/s = megabytes per second
KB/s = kilobytes per second
4 MB/s = 4000 KB/s
Also, that will put the files inside public_html/admidio/adm_my_files/documents_video/western/ itself, and not make any subdirectories within that target. The approach to make the subdirectories, while including or excluding files, is a little non-intuitive:
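The exact command is not repeated here, but the usual rsync idiom for that looks something like this sketch (the .mp4 extension and the source path are placeholders): --include='*/' lets rsync descend into every subdirectory, the next rule keeps the wanted files, the final --exclude='*' drops everything else, and --prune-empty-dirs skips directories that would end up empty.
Code:
rsync -av --prune-empty-dirs \
    --include='*/' \
    --include='*.mp4' \
    --exclude='*' \
    source/ public_html/admidio/adm_my_files/documents_video/western/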
To: syg00, LQ Veteran
Thank you for the idea. I would prefer that. The answer is up to Admidio. I searched the Admidio wiki and forum for the keywords symlink and hard link; nothing was found. It looks like a good test is to put a symlink and a hard link into Admidio and see what happens. I had hoped someone would already know and tell me. The surest way is to do what the documentation says and put the files in their place. I have not worked much with symlinks and hard links and need to research how to make them. It looks like Turbocapitalist gave one method.
To: Turbocapitalist, LQ Guru, Contributing Member
I am not sure, but it looks like your rsync syntax creates the subdirectories. If I want them, I will use what I am more familiar with: mkdir -p. I do not wish to delete the original files; I will link to them if that works.
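For reference, the two link types are created like this (file names are hypothetical; note that a hard link only works within one filesystem, and some applications will not follow symlinks):
Code:
# Symbolic link: a small pointer file that stores the target path.
ln -s /home/user/source/video1.mp4 /home/user/target/video1.mp4

# Hard link: a second directory entry for the same file data, so
# no extra disk space is used; both names must be on the same
# filesystem.
ln /home/user/source/video1.mp4 /home/user/target/video1.mp4
Whether Admidio accepts either one is exactly the open question above, so the test described there still applies.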
To all:
I have not heard otherwise, so it looks like the surest method is the one described above: do what the documentation says and put the files in their place.