Hi everyone,
I am running a series of simulations on a cluster node. Each simulation produces some pretty large data files, so I would like to transfer them to the storage server as soon as they are written. I would like to do this in a bash script (there are 68 simulations being run, two at a time, each lasting 2-3 hours, so I do need to automate).
The problem is, I can't ssh or scp from the node to the storage server.
(Yes, I have set up an ssh identity file in my home dir on the cluster head and the corresponding authorized_keys entry in my home dir on the storage server, so that I am not asked for a password when I scp from the script.)
Since the node doesn't communicate with the outside world, I am assuming I need to go back to the head, scp the file, then go back to the node to run another simulation, all from within the script. Can anyone help me with this? I tried
Code:
logout    # back to the head
scp files storage_server
rsh node10    # back on the node
cd wherever/i/need/to/be/on/the/node
This is put in a loop, so it runs every time a simulation finishes. 'logout' doesn't work because the shell I am in on the node is not a login shell. So I need to use 'exit', but since 'exit' terminates the whole shell the script is running in, that will get me out of the 'for i in ...' loop prematurely. See what I'm saying? Kinda tricky.
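To show the structure I'm after, here is a dry-run sketch. The simulation command, file names, and the transfer() helper are placeholders I made up, and echo stands in for the real remote command so the loop can be tried out anywhere; I'm not sure whether hopping back through the head via a nested ssh like this is even the right approach:

```shell
#!/bin/sh
# Dry-run sketch: after each simulation, copy the result out through
# the head without leaving the shell on the node. 'echo' stands in
# for the real remote command so the loop structure runs anywhere.
transfer() {
    # real version would be something like:
    #   ssh head "scp node10:$1 storage_server:archive/"
    echo ssh head "scp node10:$1 storage_server:archive/"
}

for i in sim01 sim02; do      # 68 simulations in the real run
    # ./run_simulation "$i"   # placeholder; produces /data/$i.dat
    transfer "/data/$i.dat"
done
```

The hope is that running scp from the head through a nested ssh would sidestep the logout/exit problem entirely, but that assumes the head can reach both the node and the storage server, which is exactly what I'd like confirmed.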
Any help would be appreciated.