Capture responses of 20 different URL hits in a single shell script
I need to hit 20 URLs from one shell script, display their responses on the console, and write them to a text file too.
Please consider the case when a URL is not responding. Please help. Thanks in advance. |
wget -i urls.txt -q -O - | tee output.txt
urls.txt contains the URLs, one per line. This will write all the results, concatenated, to one file, output.txt, as well as displaying them on the screen. Wget's progress messages should not show. Errors will show on the console but won't be in output.txt |
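If you also need the "URL not responding" case handled per URL, a minimal sketch is to loop over urls.txt instead of using `wget -i`, so one dead URL cannot stop the rest. This assumes wget is available and urls.txt holds one URL per line; the demo entry it creates (a localhost port that refuses connections) is a hypothetical stand-in, not from the thread.

```shell
#!/bin/sh
# Fetch each URL in urls.txt one at a time, appending every response
# to output.txt as in the one-liner above, and reporting URLs that
# do not respond instead of silently skipping them.
[ -f urls.txt ] || echo 'http://127.0.0.1:9/' > urls.txt  # hypothetical demo entry
: > output.txt                        # start with an empty output file
while IFS= read -r url; do
    [ -z "$url" ] && continue         # skip blank lines
    # --timeout and --tries keep a non-responding URL from hanging the loop
    if body=$(wget -q --timeout=10 --tries=1 -O - "$url"); then
        printf '%s\n' "$body" | tee -a output.txt
    else
        echo "no response from $url"
    fi
done < urls.txt
```

Capturing wget's output into a variable first (rather than piping it straight into tee) means the `if` tests wget's exit status, not tee's.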
This is a duplicate thread and has been reported.
|
Hi cantab,
Can you help me with the script you mentioned, so that if a URL is not responding it displays "Not connected" in red, and if it is responding it displays "OK" in green? Thanks in advance.
|
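A minimal sketch of what was asked, assuming wget is available and urls.txt holds one URL per line. The ANSI escape codes are the standard terminal colour sequences, and the demo URL (a localhost port that refuses connections) is a hypothetical stand-in, not from the thread.

```shell
#!/bin/sh
# Print "OK" in green for each URL in urls.txt that responds, and
# "Not connected" in red for each one that does not.
GREEN='\033[0;32m'; RED='\033[0;31m'; RESET='\033[0m'
[ -f urls.txt ] || echo 'http://127.0.0.1:9/' > urls.txt  # hypothetical demo entry
while IFS= read -r url; do
    [ -z "$url" ] && continue         # skip blank lines
    # --spider only checks the URL; --timeout/--tries avoid long hangs
    if wget -q --timeout=10 --tries=1 --spider "$url"; then
        printf "%s: ${GREEN}OK${RESET}\n" "$url"
    else
        printf "%s: ${RED}Not connected${RESET}\n" "$url"
    fi
done < urls.txt | tee url_status.txt
```

The colour codes are embedded in the printf format string, where the shell's printf interprets the `\033` escape; tee keeps a plain copy of the status lines in url_status.txt, mirroring the output.txt approach above.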
Please post your thread in only one forum. Posting a single thread in the most relevant forum will make it easier for members to help you and will keep the discussion in one place. This thread is being closed because it is a duplicate.
http://www.linuxquestions.org/questi...4/#post4056588 |