LinuxQuestions.org
Linux - Server This forum is for the discussion of Linux Software used in a server related context.

Old 12-06-2021, 02:26 PM   #1
b1bb2
Member
 
Registered: Oct 2021
Posts: 90

Rep: Reputation: Disabled
Work within bandwidth limit


I have Linux shared hosting with cPanel. I want to copy many large files within my hosting account. I tried to do this using simple copy commands in a bash script. It did the copy, but it seems to have exceeded my allocated bandwidth limit: I/O Usage turned red. I was later surprised to find that the bash script kept running even after I logged out. This would not be normal behaviour on a PC, but is it normal for a hosted server? I have more copying to do and would like a better method. I could copy each separate file manually, but this is tedious. I could try cron, but the documentation says I should use at, and I would need to set up a separate job for each file. I suppose a good way is to put at statements into a bash script. But I can not get at working. I tried:
Code:
at -f file.sh now
You are not authorized to run this command
Code:
/bin/sh/at -f file.sh now
bash: /bin/sh/at: Not a directory
Code:
man batch
bash: man: command not found

I tried experimenting on my offline PC, where at is a package that must be installed. I read that I need to run
Code:
sudo systemctl enable --now adt
It does not work. Do I need to set up LAMP?
 
Old 12-06-2021, 04:49 PM   #2
wpeckham
LQ Guru
 
Registered: Apr 2010
Location: Continental USA
Distribution: Debian, Ubuntu, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, VSIDO, tinycore, Q4OS, Manjaro
Posts: 5,797

Rep: Reputation: 2775
1. Rsync has bandwidth control, if you need that. It also uses ssh tunnels by default for remote connections, which provides some security where needed.

2. at does system-level scheduling; setting up its service requires root authority via sudo, doas, or another escalation command.

3. cron has a per-user interface and can run scripts or processes under your personal account. Look up crontab ("man crontab") for details.
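As an illustrative sketch (the script name and paths are hypothetical), a per-user crontab entry installed with "crontab -e" could run a copy script once a day without root:

```shell
# m h dom mon dow  command
# Run the (hypothetical) copy script every day at 03:00,
# appending its output and errors to a log in the home directory.
0 3 * * * $HOME/copy.sh >> $HOME/copy.log 2>&1
```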

You have embarked upon a journey of discovery! This is fun stuff with MANY moving parts. Welcome to the jungle, I hope you enjoy the ride!
 
Old 12-06-2021, 05:41 PM   #3
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,386
Blog Entries: 3

Rep: Reputation: 3776
1. Rsync is the way to go for copying to or from remote systems. It can reach anywhere you can log in via SSH.

Code:
rsync -aHv /local/source/directory/ b1bb2@server.example.com:/destination/directory/

# or 

rsync --archive --hard-links --verbose --progress /local/source/directory/ b1bb2@server.example.com:/destination/directory/
Be sure to use the trailing slash when naming directories. It can also transfer individual files. See "man rsync" for the full reference manual.

2. Any account can use at as long as it is not listed in /etc/at.deny; if /etc/at.allow exists, then only accounts listed there may use it.

Code:
sudoedit /etc/at.deny
test -f /etc/at.allow && sudoedit /etc/at.allow
 
Old 12-06-2021, 07:27 PM   #4
b1bb2
Member
 
Registered: Oct 2021
Posts: 90

Original Poster
Rep: Reputation: Disabled
This copying is all within the remote system; the data does not enter or leave the server. This bash script, run on the server, does the copy, but it also triggers a bandwidth warning at 4 MB/s.
Code:
cp public_html/Videos/western/*/*.avi public_html/admidio/adm_my_files/documents_video/western/;
cp public_html/Videos/western/*/*.mkv public_html/admidio/adm_my_files/documents_video/western/;
Terms of service says
Quote:
No single shared hosting account is permitted to use more than 20% of the server resources at a time.
Will this bash script work? Please check the syntax. The && ensures that the command finishes before the next command starts. Is it needed? Am I right? After I start the script, can I log out and wait for it to finish?
Code:
 
rsync --bwlimit=2 public_html/Videos/western/*/*.avi public_html/admidio/adm_my_files/documents_video/western/ &&;
rsync --bwlimit=2 public_html/Videos/western/*/*.mkv public_html/admidio/adm_my_files/documents_video/western/ &&;
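For what it's worth, a trailing "&&;" is a shell syntax error: && must sit between two commands, while a plain ; (or a newline) already runs commands one after another. A minimal sketch of the difference, using throwaway files under /tmp:

```shell
#!/bin/sh
# Set up throwaway demo files.
mkdir -p /tmp/demo_dest
touch /tmp/demo_src.avi

# ';' (or a newline) runs the next command regardless of success.
cp /tmp/demo_src.avi /tmp/demo_dest/first.avi ; echo "ran either way"

# '&&' runs the next command only if the previous one succeeded.
cp /tmp/demo_src.avi /tmp/demo_dest/second.avi && echo "copy succeeded"

# Either way, each command finishes before the next starts; '&&' only
# adds the success check. To keep a job running after logout, it can be
# started with nohup, e.g.:  nohup ./copy.sh > copy.log 2>&1 &
```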
 
Old 12-07-2021, 12:01 AM   #5
b1bb2
Member
 
Registered: Oct 2021
Posts: 90

Original Poster
Rep: Reputation: Disabled
I did research and it looks like this will work. I am marking this as solved. I do not need to run the code yet, so you still have time to warn me if needed. The --dry-run will be removed later.
MB/s = megabytes per second
KB/s = kilobytes per second
4 megabytes per second = 4000 kilobytes per second
Code:
 
rsync --dry-run --bwlimit=2000 public_html/Videos/western/*/*.avi public_html/admidio/adm_my_files/documents_video/western/;
rsync --dry-run --bwlimit=2000 public_html/Videos/western/*/*.mkv public_html/admidio/adm_my_files/documents_video/western/;
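One caution on units: a bare number given to rsync's --bwlimit is read as units of 1024 bytes per second (KiB/s), so --bwlimit=2000 allows roughly 2 MB/s, not 2000 KB/s exactly; newer rsync versions also accept suffixes such as --bwlimit=2m. The arithmetic:

```shell
#!/bin/sh
# --bwlimit=2000 -> 2000 KiB/s; convert to bytes per second.
bytes_per_sec=$((2000 * 1024))
echo "$bytes_per_sec bytes/s"   # about 2.0 MB/s, half the 4 MB/s that triggered the warning
```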
 
Old 12-07-2021, 12:06 AM   #6
syg00
LQ Veteran
 
Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 21,163

Rep: Reputation: 4125
Do you really need to copy the data - would symlinks not suffice ?.
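Whether the application will follow symlinks is an open question in this thread, but for reference, a link for every matching file could be created with find and ln. A sketch using throwaway paths in place of the thread's public_html directories:

```shell
#!/bin/sh
# Stand-in directories for the thread's source and destination paths.
src=/tmp/demo_videos
dst=/tmp/demo_links
mkdir -p "$src/westerns" "$dst"
touch "$src/westerns/film.avi" "$src/westerns/film.mkv"

# Create one symlink per matching file; absolute source paths keep
# the links valid no matter where they are resolved from.
find "$src" -type f \( -name '*.avi' -o -name '*.mkv' \) \
    -exec ln -s {} "$dst/" \;
```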
 
Old 12-07-2021, 12:31 AM   #7
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,386
Blog Entries: 3

Rep: Reputation: 3776
Also, that will put the files inside public_html/admidio/adm_my_files/documents_video/western/ itself, and not make any subdirectories within that target. The approach to make the subdirectories, while including or excluding files, is a little non-intuitive:

Code:
rsync --dry-run --archive --hard-links --verbose --bwlimit=2000 \
        --include='*/' --include='*.avi' --include='*.mkv' --exclude='*' \
        public_html/Videos/western/ \
        public_html/admidio/adm_my_files/documents_video/western/
If the target is on the same file system, then hard links might be fine, too, but trickier.

Code:
rsync --dry-run --archive --hard-links --verbose --bwlimit=2000 \
        --link-dest=public_html/Videos/western/* \
        --include='*/' --include='*.avi' --include='*.mkv' --exclude='*' \
        public_html/Videos/western/ \
        public_html/admidio/adm_my_files/documents_video/western/
However, in the case of symbolic links, find may be the more appropriate tool.

If you are moving the files, then there are options to delete files after the transfer is complete.
 
Old 12-07-2021, 03:22 AM   #8
b1bb2
Member
 
Registered: Oct 2021
Posts: 90

Original Poster
Rep: Reputation: Disabled
To: syg00, LQ Veteran
Thank you for the idea. I would prefer that. The answer is up to Admidio. I searched the Admidio wiki and forum for the keywords symlink and hard link. Nothing was found. It looks like a good test is to put a symlink and a hard link into Admidio and see what happens. I hoped someone would already know and tell me. The surest way is to do what the documentation says and put the files in their place. I have not worked much with symlinks and hard links. I need to research how to make them. It looks like Turbocapitalist gave one method.

To: Turbocapitalist, LQ Guru, Contributing Member
I am not sure, but it looks like your rsync syntax creates subdirectories. If I want them, I will use what I am more familiar with: mkdir -p. I do not wish to delete the original files. I will link to them if that will work.

To all:
I have not heard otherwise, so it looks like the surest method is this. I repeat.
Code:
 
rsync --dry-run --bwlimit=2000 public_html/Videos/western/*/*.avi public_html/admidio/adm_my_files/documents_video/western/;
rsync --dry-run --bwlimit=2000 public_html/Videos/western/*/*.mkv public_html/admidio/adm_my_files/documents_video/western/;
 
  


All times are GMT -5. The time now is 10:01 AM.
