Old 08-01-2015, 04:14 PM   #1
/dev/dog
Member
 
Registered: Nov 2014
Location: US
Distribution: Debian
Posts: 39

Rep: Reputation: 0
Question: next command starts before one finishes


Hi! You guys are awesome!

Anyway, I am learning a lot just by fooling around with VMs.

I am confused about this output.

Code:
alpha@dogpack:~$ find / &> /dev/null & echo "---Current-Jobs---" ; jobs ; pgrep "find" | xargs kill && jobs
[1] 1992
---Current-Jobs---
[1]+   Running                     find / &> /dev/null &
[1]+   Running                     find / &> /dev/null &
alpha@dogpack:~$
For newbies even noobier than I am who are viewing this thread, I'll explain what this all means in the second reply to this thread.
My question is, am I correct in assuming that the last command
Code:
jobs
executed before the second-to-last command
Code:
pgrep "find" | xargs kill
finished?
If so, how can I make it so that the last 'jobs' happens only after the second-to-last command finishes?

Thank you all in advance
 
Old 08-01-2015, 04:24 PM   #2
/dev/dog
Member
 
Registered: Nov 2014
Location: US
Distribution: Debian
Posts: 39

Original Poster
Rep: Reputation: 0
For people who want to figure out what the above command means:

Code:
find / &> /dev/null & echo "---Current-Jobs---" ; jobs ; pgrep "find" | xargs kill && jobs
Code:
find /
recursively lists every file and directory it encounters, starting from the root directory (/)
Code:
&>
redirects both standard output and standard error to the location that follows
Code:
/dev/null
a pseudo-device that discards everything written to it
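For instance (a made-up command line, just to illustrate): the directory listing and the error about the missing path are both discarded. Note that &> is bash shorthand for the portable form > /dev/null 2>&1.
Code:
ls /etc /nonexistent &> /dev/null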
Code:
&
runs the preceding command in the background, so the shell returns immediately and can accept new commands
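A quick illustration (the PID shown is made up): the shell prints the job number and PID and returns to the prompt right away instead of waiting 60 seconds.
Code:
alpha@dogpack:~$ sleep 60 &
[1] 2101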
Code:
echo "---Current-Jobs---"
prints "---Current-Jobs---" to the screen
Code:
;
marks the end of one command so another can follow on the same line; the second command runs regardless of whether the first succeeded
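A small illustration: the echo still runs even though false always fails.
Code:
alpha@dogpack:~$ false ; echo "still runs"
still runs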
Code:
jobs
lists the current shell's background jobs and their statuses
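As an aside, jobs -l also prints each job's PID next to the job number (output here is what I would expect to see while find is still running):
Code:
alpha@dogpack:~$ jobs -l
[1]+  1992 Running                 find / &> /dev/null &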
Code:
pgrep "find"
searches the process list for processes whose names match the pattern "find" and prints their process IDs (PIDs)
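In this case it would print the PID from the first line of my output above:
Code:
alpha@dogpack:~$ pgrep "find"
1992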
Code:
 |
sends the standard output of the previous command to the standard input of the next
Code:
xargs
reads its standard input and converts it into command-line arguments for the command that follows it
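A small illustration (the directory names are arbitrary): this runs mkdir one two three.
Code:
echo one two three | xargs mkdir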
Code:
kill
sends a signal to a process (by default SIGTERM, which asks it to terminate)
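Other signals can be named explicitly (PID 1992 here is just the one from my output above):
Code:
kill 1992          # send SIGTERM: ask the process to terminate
kill -KILL 1992    # send SIGKILL: end it immediately, no cleanup possible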
Code:
pgrep "find" | xargs kill
pgrep "find" returns a process ID of the 'find' command I executed
then the pipe (|) sends that as standard input to xargs
xargs converts that standard input to 'kill's argument to kill
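As an aside, pkill combines the matching and the signalling into one step:
Code:
pkill "find"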
Code:
&&
runs the next command only if the previous one succeeded (exited with status 0)
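A small illustration: the cd runs only if mkdir succeeded.
Code:
mkdir /tmp/demo && cd /tmp/demo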

My confusion is that the last 'jobs' printed the final line of the output, which says the find command is still Running.
I thought the previous command would have terminated that process.
 
Old 08-01-2015, 08:39 PM   #3
rknichols
Senior Member
 
Registered: Aug 2009
Distribution: Rocky Linux
Posts: 4,777

Rep: Reputation: 2212
That last jobs command does run after the kill command completes. Your problem is that the kill command just sends a signal which is then queued for the target process. Nothing happens to that process until the next time it is scheduled, sees the signal, performs whatever cleanup actions are needed, and finally terminates. Even then, it remains as a zombie process until its parent reaps its status with a wait() call. You have no control over when your final jobs process gets scheduled in that sequence.
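A minimal sketch of one way to get that ordering in bash: save the background job's PID from $!, and after kill, use the shell builtin wait, which blocks until the process has actually exited and been reaped, so the final jobs no longer shows it as Running.
Code:
find / &> /dev/null &
FIND_PID=$!            # PID of the backgrounded find
echo "---Current-Jobs---"
jobs
kill "$FIND_PID"       # send SIGTERM
wait "$FIND_PID"       # block until find has exited and been reaped
jobs                   # the job no longer appears as Running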
 
1 member found this post helpful.
Old 08-02-2015, 04:36 AM   #4
/dev/dog
Member
 
Registered: Nov 2014
Location: US
Distribution: Debian
Posts: 39

Original Poster
Rep: Reputation: 0

Quote:
Originally Posted by rknichols
I see, well, that makes sense. Thank you very much.
 
  


