What is the best action to take if you delete a file and want to recover it?
No, I haven't done anything stupid, but I have a hunch that I'm going to use rm (or, god forbid, something like rm -fr) recklessly some day and delete something important.
And because luck favors the prepared, I wanted to know the best course of action to take in such an event. I'm guessing you'd first want to shut down your computer right away (because some programs or daemons could overwrite the place where your data was, right?) and then download some file-recovery software for Linux?
This is just a guess, though, and I have no idea how it would vary between file systems, for example.
I'm guessing you'd first want to shut down your computer right away (because some programs or daemons could overwrite the place where your data was, right?) and then download some file-recovery software for Linux?
You're partly right: the data area can be quickly overwritten, but downloading a recovery tool after the fact risks doing exactly that...
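For what it's worth, here is one hedged sketch of the "act immediately" sequence. The device name, mount point, and the extundelete tool are assumptions for an ext3/ext4 /home on its own partition; adapt everything to your system and do not run these blindly:

```shell
# 1. Stop writes to the affected filesystem as soon as possible:
#      mount -o remount,ro /home
# 2. Image the partition to a *different* disk and work only on the copy:
#      dd if=/dev/sda3 of=/mnt/usb/home.img bs=4M
# 3. Run the recovery tool against the image, never the live disk:
#      extundelete /mnt/usb/home.img --restore-all
```

The point of imaging first is that every recovery attempt on the live disk is itself a chance to overwrite the very blocks you want back.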
Here are a couple of tutorials on the subject: #1 and #2.
Cheers
Last edited by DragonSlayer48DX; 02-17-2009 at 07:21 AM.
The best action to take is not to take that action ... or to prevent it. What I did was make a wrapper script around rm, call it srm (safe rm), and use that as root. The script prevents you from deleting top-level directories.
Code:
#!/bin/sh
# This script takes exactly 2 arguments: an rm option and a target path.
if test "$#" != 2
then
    # fail
    echo 'ERROR: This script requires exactly 2 arguments'
    exit 1
fi
# Don't delete anything within 2 levels of the root directory.
# Note: -maxdepth must come before other tests, -F makes grep treat the
# path as a literal string rather than a regex, and -q quietly sets the
# exit status (replacing the redirect to /dev/null).
if find / -maxdepth 2 -type d | grep -Fq -- "$2"
then
    # fail
    echo "ERROR: Bad idea, will not remove $2"
    exit 1
fi
rm "$1" "$2"
# success
exit 0
If you want, another strategy is to make a blacklist of files that you don't want deleted, probably you can generate it using 'find'.
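As a rough sketch of that blacklist idea, here is one way it could look. All paths live in a throwaway temp directory so the example is self-contained; a real setup would point find at the directories you care about:

```shell
# Build a blacklist of protected files with find, then consult it before
# deleting anything (hypothetical layout under a temp directory).
tmp=$(mktemp -d)
mkdir -p "$tmp/keep"
touch "$tmp/keep/important.txt" "$tmp/scratch.txt"

# Generate the blacklist once
find "$tmp/keep" -type f > "$tmp/blacklist"

# Check a path against the blacklist before running rm
is_protected() {
    grep -Fxq -- "$1" "$tmp/blacklist"
}

is_protected "$tmp/keep/important.txt" && echo "protected"
is_protected "$tmp/scratch.txt" || echo "not protected"
```

grep -Fx matches the whole line as a literal string, so only exact paths from the generated list are treated as protected.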
That is to say, "I removed my home directory 2 seconds ago, what should I do RIGHT NOW?" I want to know what I should do in that situation.
And I'm saying that you should prevent this from happening, because that's the only reliable way. You can go ahead and try all the methods above, as well as: http://tldp.org/HOWTO/Ext2fs-Undeletion-4.html and http://tldp.org/LDP/LGNET/156/misc/l..._recovery.html
Data carvers like foremost and testdisk will also come in handy. BUT these methods rarely lead to 100% recovery, so the best way is to prevent it from happening.
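If it helps, a hedged sketch of invoking those carvers; the device and output paths are hypothetical, and you should always carve from an unmounted partition or an image:

```shell
# foremost is non-interactive: carve common file types out of the raw
# partition into a directory on a *different* disk:
#      foremost -t jpg,pdf,doc -i /dev/sdb1 -o /mnt/usb/recovered
# testdisk (and its companion photorec) walk you through menus instead:
#      testdisk /dev/sdb
```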
I was searching for this kind of thread. The script looks OK to me, but I need to know how you made the wrapper; I don't know how to make one. Can you point out, from the script above, how you did it? Anyway, thanks for the script, and if possible could you please explain how it works?
You copy all of that into a text file called 'srm'. Then you make the text file executable. Then you move or symlink the text file into your path (type 'echo $PATH' to see where).
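Those install steps can be sketched with a temporary directory standing in for a real PATH entry; the script body here is just a stub to show the mechanics:

```shell
# Create a stub 'srm' in a throwaway directory, make it executable,
# and put that directory on PATH (a symlink into ~/bin works the same way).
bin=$(mktemp -d)
cat > "$bin/srm" <<'EOF'
#!/bin/sh
echo "srm called with $# argument(s)"
EOF
chmod +x "$bin/srm"
PATH="$bin:$PATH"

srm -r sometarget   # prints: srm called with 2 argument(s)
```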
How the script works is rather simple:
Code:
if find / -maxdepth 2 -type d | grep -Fq -- "$2"
then
    # fail
    echo "ERROR: Bad idea, will not remove $2"
    exit 1
fi
This is the main piece of code. It uses find to list all directories up to 2 levels deep from /, and if grep finds a match among those, the script refuses to delete.
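To see that check in isolation, here is a small, reproducible sketch that swaps the live find output for a fixed list (the paths are made up):

```shell
# Stand-in for the output of `find / -maxdepth 2 -type d`
protected='/home
/usr
/usr/bin'

# Same test the script performs: is the target found in the list?
check() {
    printf '%s\n' "$protected" | grep -Fq -- "$1" && echo "blocked" || echo "allowed"
}

check /usr        # blocked
check /tmp/file   # allowed
```

Because -F treats the argument as a literal substring, /usr matches both the /usr and /usr/bin lines, while /tmp/file matches nothing and is allowed through.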
Last edited by H_TeXMeX_H; 02-18-2009 at 03:24 AM.
One day, I managed to delete the contents of /bin (or was it /usr/bin? I forget...).
Sure was a good thing that I use rsync for a daily backup regimen. I went into the backup directory, and copied the entire contents of the missing directory back into place.
I would have to agree that, while possible, data recovery is dicey at best and should be a last resort.
Having a backup to fall back on would be the best option, regardless of the method of backing up.
I lost count of how many times I ended up reinstalling Linux on my machine when I was learning; it became part of my routine for a while. Now the only time I reinstall Linux is to try a different distro or to have a fresh machine.