LinuxQuestions.org
LinuxQuestions.org > Forums > Linux Forums > Linux - General
Old 11-16-2012, 10:43 AM   #1
Ztcoracat
Senior Member
 
Registered: Dec 2011
Distribution: CentOS & Linux Mint
Posts: 3,400
Blog Entries: 1

Rep: Reputation: Disabled
Bash: Stderr & Stdout 2 file


Hi:

I was able to understand redirecting stdout to stderr
Example: grep da * 1>&2
And the example:
grep * 2>&1 ....from reading this page:
http://www.tldp.org/HOWTO/text/Bash-Prog-Intro-HOWTO

When I had gotten to this example on that page:
stderr and stdout to file; I was completely bewildered-
I know rm is remove, and I thought (-name core) would be the name of an application, but I'm confused-
Code:
rm -f $(find / -name core) & > /dev/null
When and why would I use this command?
Does null mean 'pass in absolute silence'?
And where it says (-name core), is that where I would type in the name of the application?

Also; the page says "This will place every output of a program to a file. This is sometimes suitable for cron entries (if you want a cmd to pass in absolute silence)."

Would the file this is referring to be a file associated with a cron job? Not to be confused with the script that the cron job runs? Right?
 
Old 11-16-2012, 11:16 AM   #2
catkin
LQ 5k Club
 
Registered: Dec 2008
Location: Tamil Nadu, India
Distribution: Servers: Debian Squeeze and Wheezy. Desktop: Slackware64 14.0. Netbook: Slackware 13.37
Posts: 8,563
Blog Entries: 29

Rep: Reputation: 1179
-name core tells find to look for files and directories named core.

The command could be routinely used to remove programs' core dump files (memory contents), which are conventionally named core and are only useful if you know how to examine them to find the cause of the program crashing.

/dev/null is a special file. Anything written to it is discarded. When you read from it you get an endless stream of ASCII NUL characters.

&> redirects both standard output (stdout) and standard error (stderr) to the file that follows &>. AFAIK it does not redirect "every output", but few programs produce output other than stdout and stderr.

The technique of redirecting stdout and stderr to /dev/null is useful for cron jobs because cron intercepts any output and helpfully tries to make it available, typically by mailing it to the user that it ran the job for.
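To make the redirections concrete, here is a small sketch (the msg function is invented for illustration) showing the difference between discarding stdout, stderr, or both:

```shell
# A toy function that writes one line to each stream.
msg() { echo "to stdout"; echo "to stderr" >&2; }

msg 1> /dev/null       # discards stdout; "to stderr" still appears
msg 2> /dev/null       # discards stderr; "to stdout" still appears
msg &> /dev/null       # bash shorthand: discards both streams
msg > /dev/null 2>&1   # portable equivalent of the line above
```

The last form works in any Bourne-style shell; the &> shorthand is a bash extension.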
 
Old 11-16-2012, 12:00 PM   #3
rknichols
Senior Member
 
Registered: Aug 2009
Distribution: CentOS
Posts: 1,599

Rep: Reputation: 670
Quote:
Originally Posted by catkin View Post
/dev/null is a special file. Anything written to it is discarded. When you read from it you get an endless stream of ASCII NUL characters.
Actually, that would be /dev/zero. Reading from /dev/null returns EOF, not an ASCII NUL. Writes to either /dev/null or /dev/zero are discarded.
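The distinction is easy to check from a shell; a quick sketch:

```shell
# /dev/zero: an endless stream of NUL bytes -- take four and display them.
head -c 4 /dev/zero | od -An -c

# /dev/null: immediate end-of-file on read -- zero bytes come back.
wc -c < /dev/null

# Writes to either device are simply discarded.
echo "discarded" > /dev/null
```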
 
1 member found this post helpful.
Old 11-16-2012, 12:09 PM   #4
catkin
LQ 5k Club
 
Registered: Dec 2008
Location: Tamil Nadu, India
Distribution: Servers: Debian Squeeze and Wheezy. Desktop: Slackware64 14.0. Netbook: Slackware 13.37
Posts: 8,563
Blog Entries: 29

Rep: Reputation: 1179
Quote:
Originally Posted by rknichols View Post
Actually, that would be /dev/zero. Reading from /dev/null returns EOF, not an ASCII NUL. Writes to either /dev/null or /dev/zero are discarded.
Oops. Thanks for the correction
 
Old 11-16-2012, 12:51 PM   #5
Ztcoracat
Senior Member
 
Registered: Dec 2011
Distribution: CentOS & Linux Mint
Posts: 3,400
Blog Entries: 1

Original Poster
Rep: Reputation: Disabled
Thanks for answering my thread; this is hard for me to understand-

With the technique of redirecting stdout and stderr to:
/dev/null (which if I understand should be) /dev/zero....is only useful for cron jobs and cron jobs only?

And you mentioned; "mailing it to the user that it ran the job for" Please elaborate so I can comprehend this if you could with my example:
Code:
rm -f $find / rkhunter core & > /dev/zero
Anything with rkhunter core would then be deleted?....Right?

EOF, you do mean end of file, right?
 
Old 11-16-2012, 04:40 PM   #6
rknichols
Senior Member
 
Registered: Aug 2009
Distribution: CentOS
Posts: 1,599

Rep: Reputation: 670
Quote:
Originally Posted by Ztcoracat View Post
With the technique of redirecting stdout and stderr to:
/dev/null (which if I understand should be) /dev/zero....is only useful for cron jobs and cron jobs only?
Discarding output would normally be done by redirecting to /dev/null. You could get the same (non-)result by redirecting to /dev/zero, but I can't think of any reason to do that other than being deliberately confusing. Discarding of output is most commonly done in scripts, where a command is being run for a purpose other than displaying that output. On occasion, it's useful in an interactive session as well. For example, if you wanted to find all the files under /etc that contain a reference to "localtime", you might run
Code:
grep -r -l localtime /etc
But, if you do that as a non-root user the output will be cluttered by complaints about files and directories for which you lack read permission. To see just the output for places where you do have read permission and ignore the errors, you discard stderr:
Code:
grep -r -l localtime /etc 2>/dev/null
Quote:
Originally Posted by Ztcoracat View Post
And you mentioned; "mailing it to the user that it ran the job for" Please elaborate so I can comprehend this if you could with my example:
Code:
rm -f $find / rkhunter core & > /dev/zero
Anything with rkhunter core would then be deleted?....Right?
As written, that line has so many errors that it's not going to do anything like what you thought it might, and I have no idea where "rkhunter" came into the picture. You were a bit closer in your original posting:
Code:
rm -f $(find / -name core) & > /dev/null
First of all, "& >" (with a space between the two characters) is not the same as the single token "&>" (no space). With the space, the single "&" says to run the preceding command in the background and take the rest of the line as a new command, "> /dev/null", which says to run nothing and redirect its output (of which there would be none) onto /dev/null.

The original code from the HOWTO isn't much better:
Code:
rm -f $(find / -name core) &> /dev/null
OK, that's at least syntactically correct, discarding both stdout and stderr from the rm command (Of course, the rm command never writes anything to stdout anyway, but whatever.). Also, stderr from the find command is not redirected, so you could still see complaints about unreadable directories and the like. But, the big problem here is that the output from find is substituted in a way that would make any directory with an embedded space character in its name be passed as two separate arguments to rm. Any shell wildcard characters in a directory name would also be a problem. I'd send a nastygram to the author of that document, but the document is over 12 years old, and the author is probably long gone.
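One way to sidestep the word-splitting problem is to let find remove the files itself, rather than substituting its output onto the rm command line. A sketch, using a scratch directory instead of / so it is safe to try:

```shell
# Set up a scratch directory containing a "core" file under a path with spaces.
d=$(mktemp -d)
mkdir "$d/dir with spaces"
touch "$d/dir with spaces/core"

# Let find delete the matches itself; the filenames never pass through the
# shell, so embedded spaces and wildcard characters are harmless.
find "$d" -name core -type f -delete

# Alternative where -delete is unavailable: hand the names straight to rm.
#   find "$d" -name core -type f -exec rm -f {} +
```

Here -type f also keeps find from matching a directory that happens to be named core.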

Now the default in a cronjob is to collect any output that, in an interactive session, would have gone to the terminal and, if there is any such output, send it in an email message to the user that scheduled the job. So in this case, any complaints from rm would be discarded due to the redirection, but if there were any messages from find, then a message would be sent.
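That mailing behaviour is why the HOWTO suggests the redirection for cron entries. A hypothetical crontab line (the path and schedule are invented for illustration):

```shell
# m h dom mon dow  command
# Run at 03:00 daily; both streams are discarded, so cron has nothing to mail.
0 3 * * *  find /var/tmp -name core -type f -delete > /dev/null 2>&1
```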

Quote:
Originally Posted by Ztcoracat View Post
EOF, you do mean end of file, right?
Yes. In a C program using stdio, the manifest constant is named "EOF" and is defined as -1.
 
1 member found this post helpful.
Old 11-18-2012, 07:38 AM   #7
Ztcoracat
Senior Member
 
Registered: Dec 2011
Distribution: CentOS & Linux Mint
Posts: 3,400
Blog Entries: 1

Original Poster
Rep: Reputation: Disabled
Rknichols:

The examples helped me to understand; thank you-

Thanks for the confirmation of "where I would have read permission and ignore the errors, you discard stderr"
I didn't know-
Also did not know that wildcard characters in a directory name would be a problem-

You said that the default in a cronjob is to collect any output that, in an interactive session, would have gone to the terminal (if any output) and then send it to the user that scheduled it-

If there is output that, let's say, is a 'false positive result' (just an example) (a rootkit found by a cronjob running)
then the user would be informed of the output via e-mail?

Making sure I understand as this is not exactly easy to learn; I'm glad for the help-
 
Old 11-18-2012, 09:43 AM   #8
rknichols
Senior Member
 
Registered: Aug 2009
Distribution: CentOS
Posts: 1,599

Rep: Reputation: 670
Quote:
Originally Posted by Ztcoracat View Post
Rknichols: You said that the default in a cronjob is to collect any output that, in an interactive session, would have gone to the terminal (if any output) and then send it to the user that scheduled it-

If there is output that, let's say, is a 'false positive result' (just an example) (a rootkit found by a cronjob running)
then the user would be informed of the output via e-mail?
If the programs or scripts being run as part of a cronjob write anything to stdout or stderr, that output will (by default) be sent in an email message to the user that scheduled the cronjob. Now some programs do alter their behavior when output is redirected somewhere besides a terminal**, but for the most part that email message will contain whatever would have shown up on the terminal.
**For an example of this, compare the output of running
Code:
ls
vs.
Code:
ls | cat
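The behavioural change rknichols mentions comes from ls testing whether its stdout is a terminal. A shell script can make the same check with test's -t option; a minimal sketch:

```shell
# -t 1 is true when file descriptor 1 (stdout) is attached to a terminal.
if [ -t 1 ]; then
    echo "stdout is a terminal: ls would print in columns"
else
    echo "stdout is redirected: ls would print one name per line"
fi
```

Run it twice, once directly and once piped through cat, to see the message change.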
 
1 member found this post helpful.
Old 11-18-2012, 10:04 AM   #9
Ztcoracat
Senior Member
 
Registered: Dec 2011
Distribution: CentOS & Linux Mint
Posts: 3,400
Blog Entries: 1

Original Poster
Rep: Reputation: Disabled
I compared the two outputs like you suggested:

Code:
ztcoracat@:~$ ls | cat
Desktop
Documents
Downloads
dwhelper
Music
Pictures
Public
Templates
tmp.key
Videos
vimperatorrc
Code:
ztcoracat:~$ ls
Desktop    Downloads  Music     Public     tmp.key  vimperatorrc
Documents  dwhelper   Pictures  Templates  Videos
I see the difference in format yet the output is the same.

Thanks
 
Old 11-18-2012, 10:23 AM   #10
Ztcoracat
Senior Member
 
Registered: Dec 2011
Distribution: CentOS & Linux Mint
Posts: 3,400
Blog Entries: 1

Original Poster
Rep: Reputation: Disabled
I'll continue to read, study, and learn as much as possible about Bash and Bash scripting.
If I can just get down the principles and the basics of Bash I'll be satisfied, but I don't think I'm there yet.

I'm already having trouble with this page in regard to Looping to Repeat Commands
http://rute.2038bug.com/index.html.gz
It's my hope to grasp these practices to be able to write my first script.

Back to work

Thanks again!

Last edited by Ztcoracat; 11-18-2012 at 10:34 AM.
 
  

