Old 07-19-2010, 03:10 PM   #1
richman1234
LQ Newbie
 
Registered: Mar 2010
Distribution: Fedora12 & Angstrom
Posts: 22

Rep: Reputation: 0
HOLY COW - I just deleted all of my code. HELP!!!


HELP! OK, so I was in my main projects directory and wanted to see what was in one of the other directories, so instead of cd'ing into that directory and then listing the files, I just ran ls <directory_name>.

Now I am looking at all the files in a directory I am not actually in. Then I ran rm * to delete all the files in that directory... BUT I WAS REALLY IN MY MAIN PROJECTS DIRECTORY. Now all my code is gone.

Is there a way of getting this back??? I haven't touched anything since I did it.

Please help,
Rich

Last edited by richman1234; 07-19-2010 at 03:11 PM.
 
Old 07-19-2010, 03:26 PM   #2
z99
Member
 
Registered: Aug 2009
Location: Iran-Sari
Distribution: CentOS,Fedora,Slitaz
Posts: 136

Rep: Reputation: 20
Hi, I just found this site; maybe it can be helpful, give it a try:
http://www.linux.com/news/enterprise...our-hard-drive
 
Old 07-19-2010, 03:32 PM   #3
pixellany
LQ Veteran
 
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Arch/XFCE
Posts: 17,802

Rep: Reputation: 738
When you are working in a terminal, there is normally no undo command.

If you do not write to the drive, you can probably recover something using photorec: http://www.cgsecurity.org/wiki/PhotoRec.

If the data is really valuable, then consider cloning the drive before attempting recovery. You can clone to an external USB drive (which can also be used for backups once the crisis is past).
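
Something like this would do the cloning (the device name and mount point are placeholders; double-check them with lsblk or fdisk -l before running anything):

Code:
# /dev/sdX is the affected drive, /mnt/usb the mounted external disk -- placeholders!
dd if=/dev/sdX of=/mnt/usb/disk-clone.img bs=4M conv=noerror,sync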
 
Old 07-19-2010, 03:40 PM   #4
Dinithion
Member
 
Registered: Oct 2007
Location: Norway
Distribution: Slackware 14.1
Posts: 446

Rep: Reputation: 59
Professionals in this situation advise you to unplug the power cord from your computer (without shutting it down first!) and send the hard drive in for analysis. The more you use the computer where you deleted the content, the larger the chance of data corruption.

Analysis is very expensive; I would guess somewhere between $3000 and $5000 for professional work (based on local prices here). But then again, you will probably get everything back as if nothing happened, if you followed the above guidance.

If you don't have the money to send this in for professional help, there is software that can probably help you. The downside of using software is that it increases the chance of data corruption. So if the software doesn't work, you might not get any data back afterward if you then send it in for professional analysis.

Edit:
Uhm, I mixed up the words here. 'Analysis' is supposed to be 'recovery'.
(Fun fact: the reason I mixed them up is that most professionals first do an analysis. This often costs between $1k and $2k and gives you a probability of success for the actual recovery. If you accept the recovery attempt, they will charge you a few thousand dollars more for the actual recovery.)

Last edited by Dinithion; 07-19-2010 at 03:57 PM.
 
Old 07-19-2010, 03:42 PM   #5
Telengard
Member
 
Registered: Apr 2007
Location: USA
Distribution: Kubuntu 8.04
Posts: 579
Blog Entries: 8

Rep: Reputation: 147
If your files were stored on an ext3 volume, then you will likely have to work much harder to get them back. Cease all activity on that volume and write nothing to it before proceeding with recovery attempts. Sadly I can't offer any more help than that.

In the future you may wish to abandon use of the rm command with wildcards. As you have just discovered, it is merciless. If you insist on using rm with wildcards, then please consider appending the -i and -v options to make it a little less merciless. For example:

Code:
$ rm -i -v foo
rm: remove regular file `foo'? y
removed `foo'
$ rm --help
  -i                    prompt before every removal
  -v, --verbose         explain what is being done
Another option is to create a folder inside your home directory called junk and then make an alias of rm to the command mv -i -v -t $HOME/junk.
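
For example, a minimal sketch of that alias (this assumes GNU mv, which supports the -t option):

Code:
mkdir -p "$HOME/junk"                  # create the junk folder once
alias rm='mv -i -v -t "$HOME/junk"'    # "removed" files now land in ~/junk instead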

Another option is the Python program trash-cli, which provides commands for moving files to the trash bin.
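
A rough usage sketch (command names as in recent trash-cli releases; older versions may differ):

Code:
$ trash-put oldfile.c     # move the file to the trash instead of unlinking it
$ trash-list              # review what is currently in the trash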

Good luck with your recovery attempts.

Last edited by Telengard; 07-19-2010 at 06:40 PM. Reason: remove extra slash from mv command
 
Old 07-19-2010, 05:36 PM   #6
jay73
LQ Guru
 
Registered: Nov 2006
Location: Belgium
Distribution: Ubuntu 11.04, Debian testing
Posts: 5,019

Rep: Reputation: 130
Photorec should work. However, file names are impossible to recover (they are one of the things that actually get wiped out when a file is deleted). Provided you get everything back, you'll still need to inspect each and every file to figure out what is inside and what it should be renamed to.
 
Old 07-19-2010, 06:02 PM   #7
Andrew Benton
Senior Member
 
Registered: Aug 2003
Location: Birkenhead/Britain
Distribution: Linux From Scratch
Posts: 2,073

Rep: Reputation: 64
In future, use git and keep several repositories on different computers in different buildings.
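
A minimal sketch of that setup (the host "backup-box" and the paths are placeholders, and the bare repository must be created on the remote machine first with git init --bare):

Code:
cd ~/Projects
git init                                      # start tracking the project
git add .
git commit -m "initial import"                # record the current state
git remote add backup ssh://backup-box/~/repos/projects.git
git push backup master                        # a second copy now lives off-machine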
 
Old 07-20-2010, 08:14 AM   #8
richman1234
LQ Newbie
 
Registered: Mar 2010
Distribution: Fedora12 & Angstrom
Posts: 22

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by Telengard View Post
If you insist on using rm with wildcards, then please consider appending the -i and -v options to make it a little less merciless.

Another option is to create a folder inside your home directory called junk and then make an alias of rm to the command mv -i -v -t $HOME/junk.
Thanks for all your comments. Things don't seem as bad this morning. Most of the code was test code, written to communicate with peripherals, so most of the time went into learning how it's done, not into the actual coding. Now that I know how to do it, re-coding will not take as much time.

I have also found bits and pieces of code printed out and saved on a thumb drive, so that will help in getting it back. The biggest aggravation is that I had just finished really polishing several big functions and was in the process of cleaning up when I deleted everything.

Thanks for not calling me names for not backing up, and for using "rm *". The 'funny' thing is I did the opposite of the advice above. My alias is "rm = rm -r -I"; it's set up to delete everything without asking. I think I will change that.

I really like the idea of aliasing 'rm' to a move command that puts files into a junk folder.

Another hard lesson learned for a Linux newbie!
 
Old 07-20-2010, 08:24 AM   #9
syg00
LQ Veteran
 
Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 15,424

Rep: Reputation: 2017
Backups.
.
.
.
 
Old 07-20-2010, 08:35 AM   #10
pixellany
LQ Veteran
 
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Arch/XFCE
Posts: 17,802

Rep: Reputation: 738
Quote:
Originally Posted by syg00 View Post
Backups.
.
.
.
Absolutely flipping brilliant!!! I wonder why no-one else in this thread thought of that?
 
Old 07-20-2010, 09:23 AM   #11
Telengard
Member
 
Registered: Apr 2007
Location: USA
Distribution: Kubuntu 8.04
Posts: 579
Blog Entries: 8

Rep: Reputation: 147
Quote:
Originally Posted by richman1234 View Post
Thanks for not calling me names for not backing up, and for using "rm *".
I'm not going to call you names because I've done the same myself. As you said, lesson learned the hard way. Backups would have saved you a lot of work this time.

Quote:
I really like the idea of making the alias of ' rm ' as a move command into a junk folder.
I think you could also automate emptying of the junk folder by placing rm -R $HOME/junk/* into the script $HOME/.bash_logout. I don't really recommend automating it though because then you won't have any chance to review what will be deleted.
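
That would look roughly like this (again, only if you accept losing the chance to review):

Code:
# in $HOME/.bash_logout -- empties the junk folder at every logout
rm -R "$HOME/junk/"*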

Some people create an archive /tmp/junk.tar and add files to it. Anything in the /tmp directory will be deleted at the next reboot, at least that is what I have observed on my own systems. As before, automating the delete process takes away your opportunity to review files before deleting.
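
Roughly like this (GNU tar will create the archive on the first append; verify the behavior on your system):

Code:
tar -rf /tmp/junk.tar oldfile.c    # -r appends the file to the archive
tar -tf /tmp/junk.tar              # -t lists the contents for review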

Last edited by Telengard; 07-20-2010 at 09:28 AM.
 
Old 07-22-2010, 10:17 AM   #12
richman1234
LQ Newbie
 
Registered: Mar 2010
Distribution: Fedora12 & Angstrom
Posts: 22

Original Poster
Rep: Reputation: 0
Thanks for all the suggestions. This is what I decided to do to keep from making the same mistake.

1) made a back-up script to run at the end of each day.
Code:
#!/bin/bash
BU="$HOME/Backup_Files/$(date +%Y-%m-%d)"        # assigns a directory named after today's date to BU
mkdir -p "$BU"                                   # makes the directory (no error if it already exists)
tar -cvzf "$BU/Projects.tar.gz" "$HOME/Projects" # zips all the files in ~/Projects and saves them to the new dir

2) changed the alias for "rm" in the .bashrc file to run a script:
alias rm='$HOME/bin/rm'

Code:
#!/bin/bash
BU="$HOME/Removed_Files/$(date +%Y-%m-%d)"    # assigns a directory named after today's date to BU
if [ ! -d "$BU" ]; then                       # checks to see if the directory exists
   mkdir -p "$BU"                             # if not, it makes the directory
fi

for f in "$@"; do                             # handles every argument, not just the first
   mv -f "$f" "$BU/${f##*/}-$(date +%H:%M:%S)"   # moves the file into the new directory and
done                                             # appends the time to the end of its name
I think this is pretty cool, seeing as I'm new to all of this. If anyone has any comments, I would love to hear them. I thought this might help someone avoid making the same mistake.
 
Old 07-22-2010, 11:36 AM   #13
Dinithion
Member
 
Registered: Oct 2007
Location: Norway
Distribution: Slackware 14.1
Posts: 446

Rep: Reputation: 59
That sounds like a good idea.

If you have larger projects, I would also recommend using some sort of version control. Not only does it give additional security; every version of a project is remembered, which is extremely convenient in my opinion. If you screw up, you can roll back to the last version; in fact, you can retrieve any version.

I use this for all sorts of things, mostly administrative documents for a small organization and my homework. If you run the version control server on your local machine, it won't give you an extra backup, but if you have a separate server and run version control there, you can retrieve the latest version of your project as if nothing happened should you accidentally delete it. So if you have a large project and found this interesting, I would recommend reading a little about it.

Some links that might help:
General info: http://en.wikipedia.org/wiki/Revision_control
Different versions: http://en.wikipedia.org/wiki/List_of...ntrol_software

SVN, Git, CVS, and Mercurial are, I would guess, the most used on open source systems.

Last edited by Dinithion; 07-22-2010 at 11:39 AM.
 
Old 07-22-2010, 01:49 PM   #14
Telengard
Member
 
Registered: Apr 2007
Location: USA
Distribution: Kubuntu 8.04
Posts: 579
Blog Entries: 8

Rep: Reputation: 147
Looks like a good strategy to me. You should also be making regular backups to external storage. It all depends on how valuable the data is to you, and whether or not you need to keep older versions of the same files.

Test these scripts thoroughly before you begin trusting them. Check for any errors emitted by them and fix as needed. If you are automating them, then check to make sure they are running when they should. Test them against files with spaces and other non-alphanumeric characters in their names. Test them against files with unusual permissions. Be absolutely certain that they are doing what you think they should before you trust them.
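
For example, a quick way to generate awkward test cases (all names here are made up):

Code:
mkdir -p ~/script-test && cd ~/script-test
touch -- 'file with spaces' "it's quoted" '-starts-with-dash'   # awkward names
chmod 000 'file with spaces'                                    # unusual permissions
~/bin/rm ./*    # run the new "safe rm" and check where everything ends up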
 
Old 07-22-2010, 03:21 PM   #15
CincinnatiKid
Member
 
Registered: Jul 2010
Posts: 450

Rep: Reputation: 47
I would make an image of the partition

Code:
dd if=/dev/[partition] of=[location of image file]
Make two copies of this image and do all of your analysis on the image. Without getting fancy, you can try a strings analysis of the image (this might produce a lot of output if it is a large partition).

Code:
strings [image file] > output_file
 