LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   Quik SSH question (https://www.linuxquestions.org/questions/linux-newbie-8/quik-ssh-question-312900/)

rjkfsm 04-13-2005 02:54 PM

Quick SSH question
 
If I start an executable script from an SSH session, and redirect the output to a null device, am I not supposed to get a prompt back?

RK

david_ross 04-13-2005 03:06 PM

Not if the script is still running.

You can background it with:
/path/to/script > /dev/null &

Or press Ctrl+Z while it is running to suspend it, then type `bg` to continue it in the background.
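To make the backgrounding concrete, here is a minimal sketch (`sleep 5` stands in for the actual script):

```shell
# Background a long-running command and get the prompt back immediately;
# 'sleep 5' stands in for /path/to/script.
sleep 5 > /dev/null &
pid=$!          # PID of the backgrounded job
jobs            # list the shell's background jobs
kill -0 "$pid"  # exits 0 while the job is still running

# If you started it in the foreground instead, press Ctrl+Z to suspend it,
# then run 'bg' to let it continue in the background.
```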

MA_D 04-13-2005 04:23 PM

Unless it prints to stderr. There are two output streams, stdout and stderr. If you want to make absolutely sure your redirect catches all console text you must do this:

script.sh 2>&1 >> /dev/null

2>&1 can also be stderr>stdin I think.
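One caveat worth demonstrating: the order of the redirections matters, because they are processed left to right (here `ls` on a nonexistent path stands in for a script that writes to stderr):

```shell
# stdout goes to /dev/null first, then stderr is duplicated onto it,
# so both streams are silenced:
ls /no/such/path > /dev/null 2>&1

# Reversed, stderr is duplicated onto the terminal (where stdout currently
# points) *before* stdout is sent to /dev/null -- the error still prints:
ls /no/such/path 2>&1 > /dev/null
```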

jonaskoelker 04-13-2005 06:48 PM

to MA_D: I'm sorry, but you're slightly wrong;
$ foo.sh 2>&1 >> /dev/null #*does* work, but it's a bit clumsy
$ foo.sh &> /dev/null #also works; &> means `redirect stderr and stdout to'
$ foo.sh 2>&1 # redirects stderr to stdout (or, transitively, to where stdout goes).
You can't redirect a program's stdout/stderr back to its stdin; I've tried some hackery with named pipes, and although they are very neat, they don't redirect to stdin;

however, I have found
$ mknod pipe p
$ cat < pipe > pipe
a useful tool. It enables you to write to a pipe without having to wait for the data to be consumed.
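For readers trying this at home, here is a self-contained variant using `mkfifo` (the more common way to create a named pipe than `mknod pipe p`); it shows the basic rendezvous behaviour of a FIFO rather than Jonas's exact `cat < pipe > pipe` loop:

```shell
# Run in a scratch directory.
mkfifo pipe            # create a named pipe (FIFO)

cat pipe > out.txt &   # background reader: opens the FIFO and copies it to a file
echo "hello" > pipe    # the write returns once the reader has taken the data
wait                   # cat exits on EOF (when the writer closes the pipe)
cat out.txt            # -> hello
```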

also, whether or not a process produces output (to either &1 or &2) shouldn't affect whether it can go into the background.

for more, read `advanced bash scripting guide' (or your preferred tutorial).

hth --Jonas

rjkfsm 04-13-2005 07:09 PM

Grrrrrrrrrrrrrrrreat

Well, none of these solved the problem, because when I close the SSH session, the script gets killed. Bash ends up being a child process of sshd, and the script is a child process of bash. Close out the session and sshd kills its bash process, which kills the script.

I found a script that I could modify and put in the init.d folder; when run at boot, it works. So, I am happy.
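For anyone landing here with the same problem: an init.d script works, but `nohup` (or bash's `disown`) is a lighter-weight way to let a command survive the SSH session ending. `/path/to/script` is a placeholder for the real script:

```shell
# nohup makes the command immune to the SIGHUP that the shell sends
# its children when the SSH session closes; all output is discarded.
nohup /path/to/script > /dev/null 2>&1 &

# Or, in bash: background it, then remove it from the shell's job table
# so it is never sent SIGHUP in the first place.
/path/to/script > /dev/null 2>&1 &
disown
```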

Thank you to you all for your time. I learned a lot.

RK

jonaskoelker 04-13-2005 07:21 PM

np. Also, something to google for: zombie processes. IIRC, this happens when a process loses its parent; it might turn up info relevant to your situation.

MA_D 04-14-2005 11:23 AM

Quote:

Originally posted by jonaskoelker
to MA_D: I'm sorry, but you're slightly wrong;
$ foo.sh 2>&1 >> /dev/null #*does* work, but it's a bit clumsy
$ foo.sh &> /dev/null #also works; &> means `redirect stderr and stdout to'
$ foo.sh 2>&1 # redirects stderr to stdout (or, transitively, to where stdout goes).
You can't redirect a program's stdout/stderr back to its stdin; I've tried some hackery with named pipes, and although they are very neat, they don't redirect to stdin;

however, I have found
$ mknod pipe p
$ cat < pipe > pipe
a useful tool. It enables you to write to a pipe without having to wait for the data to be consumed.

also, whether or not a process produces output (to either &1 or &2) shouldn't affect whether it can go into the background.

for more, read `advanced bash scripting guide' (or your preferred tutorial).

hth --Jonas

Ok, change every occurrence of stdin in my post to stdout and I'm right :-p. Anyway, 2>&1 does the job; I use it almost daily.

jonaskoelker 04-14-2005 11:40 AM

s/stdin/stdout/g: accepted. I'm sure 2>&1 > /foo/bar does the job; I'm just trying to show how &> can save you a lot of typing -- this is what I like to call `constructive laziness', a virtue in the field of computer science ;)


All times are GMT -5.