Expect: Handling multiple spawned processes from a single script
I am working with Expect scripts, and I am trying to write a script that does the following (based on the information that Expect can handle multiple processes within a single script):
1. From mainpc (running the Expect script), spawn two SSH connections/processes: one to pc1 and another to gw1. The SSH connections are opened from mainpc and are supposed to run simultaneously.
2. On PC3, a udp-echo server has already been started manually and is running; no script is involved.
3. Now, on pc1 after the SSH login, start a udp-echo client session that continuously sends traffic to the remote udp-echo server through gw1. Because this traffic must be sent continuously, this SSH session from mainpc must not be disconnected until the script on mainpc explicitly exits it.
4. Simultaneously, on gw1 after the SSH login, while the UDP traffic between pc1 and pc2 is passing through the gateway, perform some action: for example, read the conntrack table with "cat /proc/net/ip_conntrack" and check for a pattern such as "udp...ESTABLISHED..." to verify that the UDP connection for the traffic passing through gw1 has been established, plus some further verification steps on gw1. This SSH session from mainpc should also remain open.
5. Now come back to pc1, stop the udp-echo client traffic, and start a TCP connection to pc2, such as an SSH or FTP connection.
6. Going back to gw1 again, use the already opened SSH connection to check the ip_conntrack file for the "tcp...ESTABLISHED..." pattern, and run the other verification steps.
7. Going back to the pc1 SSH connection, stop the traffic being sent, log out of the SSH session, and close the spawned SSH-connection process to pc1 in the script.
8. Next, log out of the SSH connection to gw1 and close that spawned process too.
9. Write a PASS result for this test to a file, and then start a similar iteration again for a different test result.
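For step 9, what I have in mind is plain Tcl file I/O along these lines (the file name and message are just my placeholders):

```tcl
# append this iteration's verdict to a results file
set fh [open "test_results.txt" a]   ;# "test_results.txt" is a placeholder name
puts $fh "iteration 1: PASS"
close $fh
```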
[network diagram truncated in the post: ...(192.168.0.200/24) | (192.168.1.2/24)]
#!/usr/bin/expect -f
# On the lnx-fdr-mainpc
set timeout -1
# exp_internal 1  ;# for debugging only
# first spawn the 2 processes (a local shell each, so the ssh commands below have somewhere to run)
spawn $env(SHELL)
set pc1id $spawn_id
spawn $env(SHELL)
set gw1id $spawn_id
#start the ssh connections
send -i $pc1id "ssh 192.168.0.200\r"
send -i $gw1id "ssh 192.168.0.1\r"
#process the initial ssh setup on pc1
expect -i $pc1id "password: "
send -i $pc1id "config123\r"
expect -i $pc1id ".*root*"
# process the initial ssh setup on gw1
expect -i $gw1id "password: "
send -i $gw1id "config123\r"
expect -i $gw1id ".*root*"
# now start executing the commands on pc1
# <send a command to start the udp-echo client traffic on pc1>
# <expect some traffic-pattern output for the above command in the pc1 ssh session>
# simultaneously, check for the successful connection
# establishment on gw1
# <send a command to read the ip_conntrack file using "cat /proc/net/ip_conntrack">
# <expect the "ESTABLISHED..." pattern in the file>
# again go back to pc1, do something else
# back to gw1 and check the next result in some action
# exit out of the pc1 ssh session and close the spawned process
send -i $pc1id "exit\r"
close -i $pc1id
# exit out of gw1 ssh session and close the spawned process
send -i $gw1id "exit\r"
close -i $gw1id
puts "This is the end of this test iteration\n"
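For the elided conntrack step on gw1 above, what I intend is roughly the following (the regex pattern and the temporary finite timeout are my guesses):

```tcl
# read the conntrack table on gw1 and look for an established UDP entry
set timeout 10                       ;# use a finite timeout just for this check
send -i $gw1id "cat /proc/net/ip_conntrack\r"
expect {
    -i $gw1id
    -re "udp.*ESTABLISHED" { puts "udp connection verified on gw1" }
    timeout                { puts "FAIL: no established udp entry found" }
}
set timeout -1
```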
Now, I have started with these simple steps, and I am observing the following issues:
1. The script seems to stop/hang after an SSH connection is opened to pc1, and I only see the "pc1@root#" prompt
2. It does not move ahead with the other required operation of opening another SSH connection to gw1, or with the subsequent operations required for this test
3. Where am I going wrong with spawning and handling multiple processes?
4. My mainpc and the other PCs run Linux Fedora 13 (I installed Expect and Tcl using "yum install expect*" and "yum install tcl*")
5. Or am I getting too ambitious in wanting to do everything through a single script? But that's the idea behind automation, right?
I would be very grateful for any help. Please forgive me for such a lengthy post.