Linux - Newbie: This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
Can someone please explain if there is an application which does the following:
1. I have a laptop with an internal 200GB HD.
2. I run the application & it creates a list of all files (size & time-stamp) without actually storing them. Let's call this the "snapshot list".
3. I update some of the files on the laptop.
4. Now I run the application & it copies only the files which have changed on the laptop (those whose size/time-stamp differs from the snapshot list) onto some external media, such as a memory card. Of course, the files should be copied into their proper location in the directory tree & not just piled up in one place.
Why is this useful? Although the laptop has a 200GB HD, I typically update only a small number of files, whose total size is maybe 10MB or so. If I could back up only those which have changed, I could do this with a tiny SD card instead of lugging around an external USB HD.
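The four steps above can be sketched with standard tools. This is just a minimal sketch, not an existing application: it assumes GNU find and cp, and all of the names (the function, the stamp file, the example paths) are made up for illustration. A "stamp" file stands in for the snapshot list, so it compares time-stamps only, not sizes.

```shell
#!/bin/bash
# Sketch of the snapshot/incremental idea: copy only files modified
# since a reference "stamp" file, preserving the directory tree.
# (GNU find/cp assumed; all names here are examples, not a real tool.)

incremental_copy () {
    local src="$1" dest="$2" stamp="$3"

    # First run: no stamp yet, so create one dated at the epoch
    # so that every existing file counts as "changed".
    [ -f "$stamp" ] || touch -d '@0' "$stamp"

    mkdir -p "$dest"

    # Copy every regular file modified since the stamp, recreating
    # its directory structure under $dest instead of piling the
    # files up in one place.
    find "$src" -type f -newer "$stamp" -exec cp --parents -v {} "$dest" \;

    # Record this run as the new reference point.
    touch "$stamp"
}

# Example: incremental_copy "$HOME/work" /mnt/sdcard/backup "$HOME/.last-backup"
```

Because cp --parents keeps the full source path under the destination, the copied files land in their proper directory locations on the card.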
If I am understanding this correctly, you would like a script that takes a list of recently updated files and copies them to a flash drive of some kind, right?
The way you are wording this is a little confusing.
Quote:
2. I run the application & it creates a list of all files (size & time-stamp) without actually storing them. Let's call this the "snapshot list".
There is definitely a way to create a script to do this. First you would want to specify the directory, list the files with their sizes and last-modified dates, run a couple of conditional statements, point the output at a copy to your flash media, and voila, you have your application.
man ls
man cut
man cp
google if statements
google storing variables..........etc.
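As a rough illustration of the hints above, here is one hypothetical way a "snapshot list" could be built and compared. It uses find -printf rather than ls (easier to parse), plus the cut from the man-page hints; the function name and paths are made up, and filenames containing tabs or newlines would need extra care.

```shell
#!/bin/bash
# Hypothetical "snapshot list" approach: record each file's path,
# size and mtime, then compare against the previous run's list to
# find the files that changed.

changed_files () {
    local src="$1" list="$2"

    # New list: path <TAB> size-in-bytes <TAB> mtime (epoch seconds).
    find "$src" -type f -printf '%p\t%s\t%T@\n' | sort > "$list.new"

    # Lines present only in the new list are new or changed files;
    # cut keeps just the path column.
    if [ -f "$list" ]; then
        comm -13 "$list" "$list.new" | cut -f1
    fi

    # The new list becomes the stored snapshot for the next run.
    mv "$list.new" "$list"
}

# Example: changed_files "$HOME/work" "$HOME/.snapshot-list"
```

The output of changed_files could then be fed to cp to do the actual copying.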
I am sure there are some great programmers that could give you better hints than I can, but I think this might be a classroom assignment. I could be wrong, but...?
Thanks for replying. This is not a classroom assignment, just something which I think can be useful.
It is probably doable by hand, but it may be incredibly inefficient, especially when examining an HD of 100s of GB. I'll try to rephrase the scenario:
1. I've got a laptop with an internal HD, say 200GB.
2. The application scans over all HD files and stores their current size & time-stamp.
3. I work on the laptop, and update some files.
4. I re-run the application and it now compares each file to its stored size & time-stamp. Any files which have changed are copied to another medium for backup; however, they are stored in their proper directory location and not just dumped into one directory.
Hope this is clearer. Aren't there applications out there already which do this kind of smart backup?
Thanks. I am familiar with rsync, though I can't see an option to create a list of files with their current size and time-stamp without copying the entire files. Or am I missing something?
There is another way to approach this, have you ever heard of checksums? A checksum is a value computed from a file's contents; if two copies of a file have the same checksum, you can be confident they are identical. It can be used to verify that you have a good copy, that the file has not been tampered with, and that you have the most recent changes. Of course, the tool is only as good as the data.
What I would do is use rsync. By default it compares file size and modification time to decide what to transfer, and it can be told to compare checksums instead. It will reduce the number of commands in your script by replacing cp, the checksum step, and ls, because if it is used properly it will do all of it (locally and remotely).
You are going to have to write the script, but the people on this forum are incredibly helpful, and helpful in the right way: they will not give you the fish, they will teach you to fish!
Remember, have fun! There is nothing more rewarding in UNIX/Linux than creating a working tool that automates tasks.
#!/bin/bash

# Create the local backup folder if it does not exist yet.
direc () {
    if [ ! -d backup ]
    then
        mkdir backup
    fi
}

# Copy files that are newer than their saved .bak copy into
# "destination" (change that to wherever your media is mounted).
copy () {
    if [ ! -d backup ]
    then
        echo "The backup folder does not exist."
    else
        mkdir -p destination
        for file in *
        do
            if [ -f "$file" ] && [ "$file" -nt "backup/$file.bak" ]
            then
                cp -v "$file" "destination/$file"
            else
                echo "The file $file was not modified since last copy."
            fi
        done
    fi
}

# Record the current state of every regular file as a .bak copy,
# used as the reference point for the next run of copy.
backup () {
    direc
    for files in *
    do
        if [ -f "$files" ]
        then
            cp -v "$files" "backup/$files.bak"
        fi
    done
}

while getopts cb opts
do
    case "$opts" in
        c) copy;;
        b) backup;;
    esac
done
Obviously, this keeps a local backup copy of the files as well, which you said you didn't want to do... You would have to change "destination" to wherever your media is mounted (check with df -h).
But if you decide to use this, just call it with the appropriate options: if you name it "filecopy.sh", you'd call it with "filecopy.sh -b" to create or renew the backup copies, and "filecopy.sh -c" to copy the changed files to your drive.