LinuxQuestions.org
Old 05-31-2013, 08:16 AM   #1
jagdish.jagtap1@wipro.co
LQ Newbie
 
Registered: May 2013
Posts: 8

Rep: Reputation: Disabled
undoing previous command


By mistake I unzipped a package in my home directory instead of in new_folder, so now I have a large number of files in my home directory and I don't know which ones were already there.
Can I just roll back my previous command?
Is there any other way to delete all the extracted files?
I am using a VM over ssh, so I only have a terminal to work with.
Thanks in advance.
 
Old 05-31-2013, 08:26 AM   #2
druuna
LQ Veteran
 
Registered: Sep 2003
Posts: 10,532
Blog Entries: 7

Rep: Reputation: 2387
There is no undo command; you'll need to clean your home directory by hand.

First extract the zip file into the new_folder directory, which gives you a reference for what needs to be cleaned out of your home directory.
 
2 members found this post helpful.
Old 05-31-2013, 01:47 PM   #3
David the H.
Bash Guru
 
Registered: Jun 2004
Location: Osaka, Japan
Distribution: Debian sid + kde 3.5 & 4.4
Posts: 6,823

Rep: Reputation: 1957
*nix shell commands are generally very intolerant of mistakes. They assume you know what you are doing, and just do what you tell them to do. Most of them don't even give you any feedback when they succeed, and you'll only see messages if there are errors.

As for fixing your current problem, you can also run ls -l and look at the timestamps of the files. All the new ones should have the same date.
 
1 member found this post helpful.
Old 05-31-2013, 01:59 PM   #4
shivaa
Senior Member
 
Registered: Jul 2012
Location: Grenoble, Fr.
Distribution: Sun Solaris, RHEL, Ubuntu, Debian 6.0
Posts: 1,800
Blog Entries: 4

Rep: Reputation: 286
Quote:
Originally Posted by David the H. View Post
As for fixing your current problem, you can also run ls -l and look at the timestamps of the files. All the new ones should have the same date.
A minor addition: use the ls -lat command to list the files. The most recently modified files will be at the top of the list.
Code:
~$ ls -lat
Otherwise, follow what druuna said above.

Last edited by shivaa; 05-31-2013 at 02:00 PM.
 
1 member found this post helpful.
Old 05-31-2013, 02:27 PM   #5
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,258

Rep: Reputation: 1947
Like druuna said, just extract it in a clean directory, and then use that to clean out your home directory:

Code:
mkdir new_folder
cd new_folder
unzip ../file.zip
for i in *; do rm -r "../$i"; done
That will make a clean folder, extract the zip in it, then use those filenames as a reference to remove all extracted files from the parent directory.


None of the "ls -l" or "ls -lat" suggestions will work because files extracted from a zip retain their original timestamps (from before the zip was made). The files he extracted from the zip could have timestamps dating back years.

Last edited by suicidaleggroll; 05-31-2013 at 02:31 PM.
 
2 members found this post helpful.
Old 05-31-2013, 02:38 PM   #6
shivaa
Senior Member
 
Registered: Jul 2012
Location: Grenoble, Fr.
Distribution: Sun Solaris, RHEL, Ubuntu, Debian 6.0
Posts: 1,800
Blog Entries: 4

Rep: Reputation: 286
Quote:
Originally Posted by suicidaleggroll View Post
None of the "ls -l" or "ls -lat" suggestions will work because files extracted from a zip retain their original timestamps (from before the zip was made). The files he extracted from the zip could have timestamps dating back years.
Good catch. I don't know how we missed that point.
 
Old 05-31-2013, 02:38 PM   #7
David the H.
Bash Guru
 
Registered: Jun 2004
Location: Osaka, Japan
Distribution: Debian sid + kde 3.5 & 4.4
Posts: 6,823

Rep: Reputation: 1957
Quote:
Originally Posted by suicidaleggroll View Post
None of the "ls -l" or "ls -lat" suggestions will work because files extracted from a zip retain their original timestamps (from before the zip was made). The files he extracted from the zip could have timestamps dating back years.
Ah, I hadn't thought of that. I don't use archive formats like zip that often.

But you might still be able to tell them apart from the other files on your system, if the timestamps stored in the archive are different enough.
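A hedged side note, not raised in the thread: unzip can restore an old mtime, but the inode change time (ctime) is always set by the kernel at extraction and cannot be back-dated, so sorting by ctime still groups the newly extracted files together. A minimal sketch, simulated with touch and assuming GNU coreutils ls:

```shell
# unzip restores mtime, but ctime is always set at extraction time.
# Simulate that: back-date the mtime of one file, then compare
# mtime-sorted vs ctime-sorted listings.
cd "$(mktemp -d)"
touch existing                      # a file already in the directory
sleep 1
touch extracted                     # a freshly "extracted" file
touch -t 200001010000 extracted     # unzip-style back-dated mtime
ls -t  | head -n 1                  # by mtime: "existing" looks newest
ls -ct | head -n 1                  # by ctime: "extracted" is newest
```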
 
Old 05-31-2013, 03:26 PM   #8
Beryllos
Member
 
Registered: Apr 2013
Location: Massachusetts
Distribution: Debian
Posts: 304

Rep: Reputation: 121
Edit/update: Sorry everyone. Parts of this are incorrect. See the next posts for details.
Quote:
Originally Posted by suicidaleggroll View Post
Like druuna said, just extract it in a clean directory, and then use that to clean out your home directory

Code:
mkdir new_folder
cd new_folder
unzip ../file.zip
for i in *; do rm -r "../$i"; done
That will make a clean folder, extract the zip in it, then use those filenames as a reference to remove all extracted files from the parent directory.
Nice. Just one problem: "for i in *" does not handle filenames containing whitespace (for example, directory named "My Documents"), so it is better to use the find command, and pipe it to a read command in a while loop:
Code:
mkdir new_folder
cd new_folder
unzip ../package.zip
find . -maxdepth 1 -mindepth 1 | while IFS= read -r file; do rm -rv ~/"$file"; done
  • "-maxdepth 1" is necessary so that it does not search subdirectories
  • "-mindepth 1" prevents it from listing "."
  • I'm not sure whether (or in which situations) you need "IFS=" in this command. It clears the IFS ("internal field separator") variable for the read command. When I have tested it, I get the same result with or without IFS=
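To make the IFS= question concrete: without it, read strips leading and trailing IFS whitespace from each line, which would mangle a filename that begins or ends with spaces. A minimal demonstration, not from the thread:

```shell
# Default IFS: read trims leading/trailing whitespace from the line.
printf '  padded  \n' | { read -r line; printf '[%s]\n' "$line"; }
# prints [padded]

# With IFS= the line is assigned verbatim.
printf '  padded  \n' | { IFS= read -r line; printf '[%s]\n' "$line"; }
# prints [  padded  ]
```

So IFS= only matters for names with leading or trailing whitespace; for ordinary names the two behave the same, which is why testing with typical filenames shows no difference.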
Caution: If the zip file contains files or directories with the same name as files or subdirectories previously existing in your home directory, this method (or that of suicidaleggroll quoted above) will delete your original files, if they were not already overwritten by the unzip command.

Last edited by Beryllos; 05-31-2013 at 07:02 PM. Reason: wanted to alert the reader to an inaccuracy
 
Old 05-31-2013, 05:09 PM   #9
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,258

Rep: Reputation: 1947
Quote:
Originally Posted by Beryllos View Post
Nice. Just one problem: "for i in *" does not handle filenames containing whitespace (for example, directory named "My Documents")
Yes it does

Code:
$ touch file1
$ touch "file 2"
$ mkdir "dir 3"
$ mkdir "My Documents"
$ ls
dir 3  file1  file 2  My Documents
$ for i in *; do echo $i; done
dir 3
file1
file 2
My Documents
$ for i in *; do rm -fr "$i"; done
$ ls
$
Doing something like:
Code:
files=$(ls *)
for i in $files
Would fail, but globbing directly in the for loop works fine.

Last edited by suicidaleggroll; 05-31-2013 at 05:26 PM.
 
1 member found this post helpful.
Old 05-31-2013, 06:57 PM   #10
Beryllos
Member
 
Registered: Apr 2013
Location: Massachusetts
Distribution: Debian
Posts: 304

Rep: Reputation: 121
Quote:
Originally Posted by suicidaleggroll View Post
Yes it does
Please accept my apologies. You are right, of course. I was confusing it with the following:
Code:
for file in `find . -plus-other-find-functionality`
do
rm -r $file
done
which is a completely different thing (and doesn't work).

I think it is safe to say that the method I outlined allows additional filtering with find's options, when that is what you need.

Last edited by Beryllos; 05-31-2013 at 07:03 PM.
 
Old 06-02-2013, 06:47 AM   #11
David the H.
Bash Guru
 
Registered: Jun 2004
Location: Osaka, Japan
Distribution: Debian sid + kde 3.5 & 4.4
Posts: 6,823

Rep: Reputation: 1957
That's correct. The word-splitting problem only affects parameter or command expansions (those that start with $ or `). This is due to the shell parsing order; IFS-based word-splitting is done right after these expansions. Globbing expansion happens after the word-splitting step is completed, and therefore isn't affected. For loops are recommended for use on globbing patterns.
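A minimal sketch of that parsing order, with hypothetical file names:

```shell
# Glob expansion happens after word-splitting, so "a b" stays one word;
# command-substitution output is word-split on IFS, so it breaks apart.
cd "$(mktemp -d)"
touch "a b" c

for f in *; do printf '[%s]\n' "$f"; done
# prints [a b] then [c]

for f in $(ls); do printf '[%s]\n' "$f"; done
# prints [a], [b], [c]
```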

This, on the other hand, carries a slight risk to it.
Code:
find . -maxdepth 1 -mindepth 1 | while IFS= read -r file; do rm -rv ~/"$file"; done
By default, read delimits strings on newline characters. But it's not impossible for a filename itself to contain newlines too (rare, true, but not impossible). So to be completely safe you need to delimit the input with null separators.

Code:
while IFS= read -r -d '' file; do rm -rv "$HOME/$file"; done < <( find . -maxdepth 1 -mindepth 1 -print0 )
(I recommend using process substitution, if your shell supports it, instead of the pipe, and the $HOME variable instead of "~" too. )

But if you're using find anyway, you can often skip the loop entirely and either run the desired commands with an -exec argument, or with xargs. If the latter, be sure to use its -0 null separator input option as well. gnu find also has a dedicated -delete option you can use directly.

Code:
find . -maxdepth 1 -mindepth 1 -print0 | xargs -0 rm -rv
find . -maxdepth 1 -mindepth 1 -exec rm -rv '{}' +
find . -maxdepth 1 -mindepth 1 -delete
 
1 member found this post helpful.
Old 06-02-2013, 02:51 PM   #12
Beryllos
Member
 
Registered: Apr 2013
Location: Massachusetts
Distribution: Debian
Posts: 304

Rep: Reputation: 121
Quote:
Originally Posted by David the H. View Post
But if you're using find anyway, you can often skip the loop entirely and either run the desired commands with an -exec argument, or with xargs. If the latter, be sure to use its -0 null separator input option as well. gnu find also has a dedicated -delete option you can use directly.
Oh, no! This is the second or third time I have forgotten about -exec and piped the output of find into a while-read loop... and written it up in a forum post! :(

Thanks for taking the time to educate me. I'll work on learning and playing around with all the points you mentioned.
 
Old 06-02-2013, 04:15 PM   #13
David the H.
Bash Guru
 
Registered: Jun 2004
Location: Osaka, Japan
Distribution: Debian sid + kde 3.5 & 4.4
Posts: 6,823

Rep: Reputation: 1957
Heh, don't sweat it. Things like that happen to everyone now and then. You did your best to be helpful.


And to be fair, exec/xargs are really designed for single, simple commands, and sometimes you really do need a loop to handle a more complex series of actions. But even then you're better off creating a stand-alone script for your actions and running that as your -exec argument. The script itself would usually contain a simple for loop to process the input parameters it gets from find.

myscript.sh:
Code:
#!/bin/bash

for filename; do    # it loops over the positional parameters by default

    <commands for $filename>

done

exit 0
Execute it:
Code:
find . -type f -exec /path/to/myscript.sh '{}' +
 
  

