Old 01-27-2004, 11:29 AM   #1
webamoeba
LQ Newbie
 
Registered: Nov 2003
Posts: 18

Rep: Reputation: 0
Scripting problems - I'm sure this should be easy!


I have a script which first checks that /bin/tftp exists, then downloads two files from a TFTP server. I am trying to include a way of ensuring that the files downloaded successfully by using a function, check_file. The trouble is, if the file fails to download from TFTP I get a timeout message and the TFTP prompt. I am distributing this script using ZENworks, so it mustn't leave machines sitting at that prompt - especially when there are over 200 machines! Can I perhaps put TFTP into silent mode? Or maybe wrap the echo-to-TFTP line in an if [ ] statement? Any help appreciated, thanks. (I'm new to Linux, so don't scare me, lol)
Code:
error_message () {
	# show the message underlined in red, then pause long enough to read it
	setterm -ulcolor red
	setterm -underline on
	echo "$1"
	sleep 30s
	setterm -underline off
}

check_file () {
	# $1 = downloaded file, $2 = final destination
	if [ -e "$1" ]
	then
		cp "$1" "$2"
		chmod 700 "$2"
		rm "$1"
	else
		error_message "FILE TRANSFER ERROR - FAILED"
	fi
}

if [ -e "/bin/tftp" ]
then
	echo "get util/pass/util.s" | tftp X.X.X.X
	check_file "util.s" "/bin/util.s"

	echo "get tftp/tftp.s" | tftp X.X.X.X
	check_file "tftp.s" "/bin/tftp.s"
else
	error_message "TFTP CLIENT MISSING - FAILED"
fi
Thanks
 
Old 01-27-2004, 11:48 AM   #2
crabboy
Senior Member
 
Registered: Feb 2001
Location: Atlanta, GA
Distribution: Slackware
Posts: 1,821

Rep: Reputation: 121
Is tftp a requirement?

Try using ncftpget. It's the command line version of ncftp and has a great set of return codes if the download should fail.
Code:
ncftpget -u username -p password ftp://hostname/filename
It will return: (from the ncftpget man pages)
Code:
      ncftpget returns the following exit values:

       0       Success.
       1       Could not connect to remote host.
       2       Could not connect to remote host - timed out.
       3       Transfer failed.
       4       Transfer failed - timed out.
       5       Directory change failed.
       6       Directory change failed - timed out.
       7       Malformed URL.
       8       Usage error.
       9       Error in login configuration file.
       10      Library initialization failed.
       11      Session initialization failed.
Old 01-27-2004, 03:10 PM   #3
Eqwatz
Member
 
Registered: May 2003
Distribution: Slack Puppy Debian DSL--at the moment.
Posts: 341

Rep: Reputation: 30
You can then act on the returned value: use a case statement to echo more informative (and custom) error messages, or trigger other functions like a fail-over (a secondary site which mirrors the first). There is never an instance where it is necessary to leave the machine in an unusable condition.
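For instance, a rough sketch of that case idea on top of ncftpget's exit values (both host names and the credentials are placeholders, and error_message is the function from the script above):
Code:
ncftpget -u username -p password ftp://primaryhost/util.s
rc=$?

case $rc in
	0)	echo "download OK" ;;
	1|2)	# could not reach the primary server - fail over to the mirror
		ncftpget -u username -p password ftp://mirrorhost/util.s \
			|| error_message "MIRROR TRANSFER FAILED" ;;
	3|4)	error_message "TRANSFER FAILED (ncftpget code $rc)" ;;
	*)	error_message "UNEXPECTED ncftpget ERROR (code $rc)" ;;
esac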

You can, in fact, set up your script to generate MD5 checksums and use a test statement to compare the checksum you provide (as same_file_name.check) against the checksum the script generates for the downloaded file. This gives you a boolean true or false with which to decide what the script should do in that circumstance.
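A minimal sketch of that check, reusing the tftp download and error_message from the script above, and assuming md5sum is on the image and the .check file holds nothing but the expected checksum:
Code:
echo "get util/pass/util.s.check" | tftp X.X.X.X
expected=$(cat util.s.check)
actual=$(md5sum util.s | cut -d' ' -f1)

if [ "$actual" = "$expected" ]
then
	cp util.s /bin/util.s && chmod 700 /bin/util.s && rm util.s util.s.check
else
	error_message "CHECKSUM MISMATCH - FAILED"
fi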
 
Old 01-28-2004, 06:34 AM   #4
webamoeba
LQ Newbie
 
Registered: Nov 2003
Posts: 18

Original Poster
Rep: Reputation: 0
Nope, can't use ncftpget.

But I have come up with a solution! It's a bit rough but works OK.
Code:
error_message () {
	setterm -ulcolor red
	setterm -underline on
	echo "$1"
	setterm -underline off
}

check_file () {
	# -s: the file must exist AND be non-empty, so a zero-byte failed download is caught
	if [ -s "$1" ]
	then
		cp "$1" "$2"
		chmod 700 "$2"
		rm "$1"
	else
		error_message "FILE TRANSFER ERROR - FAILED"
	fi
}

tftp_get () {
	# send "get" followed by "quit" so the client never sits at the tftp> prompt
	{ echo "get $1"; echo quit; } | tftp 10.6.176.18
}

if [ -e "/bin/tftp" ]
then
	tftp_get "file_to_get"
	check_file "file_name" "file_destination"
else
	error_message "TFTP CLIENT MISSING - FAILED"
fi
 
Old 01-28-2004, 10:08 AM   #5
Eqwatz
Member
 
Registered: May 2003
Distribution: Slack Puppy Debian DSL--at the moment.
Posts: 341

Rep: Reputation: 30
Why are you: rm $1?

OK, I'm a little bit paranoid: make sure that any errors or failures you can think of return some sort of message, either to a log, to standard out, or as a generated e-mail to the administrator (you). The last is much preferred.

You initialize a tempfile, then use >> (append redirection, which adds the next item on a new line at the end of the file). Even though it adds to the size of the script, always interpret error messages in an easy-to-understand way.
You can, in fact, use escape characters to echo human-readable COMMENTS to your log file and still leave it parseable by scripts.
Then add a script function to e-mail all the pertinent information back to you. And don't forget to clean up and remove the tempfile after all is done, success or failure.
(It is best to use a random number generator to make sure the tempfile name is unique so you don't clobber some other script function you may have or may write in the future. Don't trust that every environment on every machine will do this for you.)
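A minimal sketch of that tempfile-and-append idea (the names and timestamp format are only illustrative; $RANDOM is a bash-ism, so on a very cut-down shell you might fall back to the PID alone):
Code:
LOGTMP="/tmp/install_log.$$.$RANDOM"	# PID + random number keeps the name unique
: > "$LOGTMP"				# initialize (create/empty) the tempfile

log () {
	# one timestamped, human-readable line per event - still easy to parse
	printf '%s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$1" >> "$LOGTMP"
}

log "tftp transfer of util.s: OK"
log "tftp transfer of tftp.s: FAILED"

# ... mail or copy $LOGTMP wherever it is useful, then always clean up:
rm -f "$LOGTMP"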

You can set up a special e-mail account for this, with a script which parses the generated message for the host, group, region, location, user, and the success/error messages; use this to create an administrative log for each specific machine. Then discard the e-mail itself after processing it.

Set it up to create a file if one doesn't exist--or concatenate messages onto an existing file, with a generated solid line or some other easily identified break between entries and some sort of identification for each of the installations. Most likely this is not going to be a one-time deal; always try to create a tool which can be used again with no editing for each use. **

Having a breaking line or other easily identified break means that you can use variables in the script to keep track of entries and speed up automated searches of each log file. One file within the directory you set up can then hold unique entries for all of the types of breaks: policies, software updates, patches, settings, private/public keys, vpn IDs, changed admin passwords, changed user passwords, anything you want to keep track of; using line numbers, generated IDs, and "friendly" human-readable names, you can quickly check any machine you are responsible for, for problems or installation information. Machines can be sorted by group or any other information you choose to put in the header of the log files, and each machine will have its own log file.***

These "breaks" can be made unique pretty easily for each kind of task. All you need is a unique pattern which can be matched using Regular Expressions. (You embed the DATE/time in the breaking line after enough characters to ensure that identity is unique for your intended use. I'll tell you this, the use of Regular Expressions is explained by "Rute User" better than any programming manual I have ever read. I have read many.

The tasks to which I refer can be scripts for generating private keys--which can then be uploaded to you, encrypted with your public key so they remain secure (along with the specific machine's public key)--which then enables you to securely change administrative passwords, vpn keys, secure-shell keys, new encryptions/logins/passwords for wireless networking, MAC address assignments, and other stuff you need to do.

Automated searching of the machine-specific log files you create can enable you to write scripts to change every machine- and user-specific network login, encryption, services, settings, software, group memberships, privileges, users, available server and network resources--anything you can do physically at the machine, with the exception of adding and removing hardware--remotely from your workstation, and as securely as if you were doing it at the keyboard of the machine itself.

Why? Because each machine has its own encryption, which is imported by the script you write, and no unencrypted communication goes over any internal or external network.

You literally can change the whole network--including Windows machines--with one command from your administrative console, without having to look up anything. That is the power of a bash script.
And, of course, all of the new machine-specific encryptions and information would be appended to the specific log for each machine/programmable device.

Planning the layout of the log file is important: any information you choose to include in the header can be used to generate reports and to sort the information by whatever entries the header contains.****


NOTE: Size alone is not sufficient to verify a file transfer; I know this from bitter experience. If a file is tarred/compressed/packed/zipped/bzipped/whatever, then there is a checksum value in the header to verify the file's integrity. That is why .iso files always have an MD5 checksum available on the server to check integrity. It is always good policy to distribute files with some means of verification. You can automate the unpacking and/or verification as part of the script.

The way you can do this is: create another tempfile with a corresponding variable name; download the bzipped file to the tempfile; attempt to unzip it to the desired final location; then process the success or failure/error value returned by the utility within your bash script. It would be a Very Good Idea (tm) to include in your bash script a test for which type of compression is used on the object you want to download from the server, and to call the specific program to decompress it; this avoids the kind of errors I make which drive me up the wall. (Sometimes I wish I could fire myself.)
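A rough sketch of that flow, leaning on the tftp_get and error_message functions from earlier in the thread (the file names and destination are placeholders; the file utility may not exist on a cut-down image, in which case you could key off the extension instead):
Code:
TMPFILE="util.s.bz2"			# lands in the current directory via tftp

tftp_get "$TMPFILE"
[ -s "$TMPFILE" ] || { error_message "DOWNLOAD FAILED"; exit 1; }

# pick the decompressor from the actual file type rather than guessing
case "$(file -b "$TMPFILE")" in
	bzip2*)	bunzip2 -c "$TMPFILE" > /bin/util.s ;;
	gzip*)	gunzip -c "$TMPFILE" > /bin/util.s ;;
	*)	cp "$TMPFILE" /bin/util.s ;;	# not compressed after all
esac

if [ $? -eq 0 ] && [ -s /bin/util.s ]
then
	chmod 700 /bin/util.s
else
	error_message "UNPACK FAILED"
fi
rm -f "$TMPFILE"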

/** Take it one step at a time; keep each task simple, with a means of processing failure, and include enough comments/documentation to be able to maintain each segment of your script separately. Complexity is built by following one task with another. A script IS a program--whether it is in Windows or Linux. **/

/*** I wouldn't use tabs or special characters--they can cause unintended results. Avoiding them also keeps the script useful no matter what operating system you may be running--always make things portable if possible. ***/

/**** Remember, if you import a set of files from Windows, you have to strip the extra characters that Windows puts on the end of every line. Linux/Unix just uses a newline character; Windows adds a carriage return and sometimes even more characters. They will not show up in anything but a hex editor--but stuff will go dramatically wrong. On the other hand, the Windows version of diff (and the script engine) may treat the lack of a carriage return as a continuation of the same line, which can give you bizarre results that are difficult to track. Always check syntax. It never hurts to open a file in a hex editor first when trying to track down a problem.
If you are calling a Windows machine with a script from a Unix/Linux environment, pipe the returned text file through the "To-Unix" text filter. Do a search on http://www.google.com/linux for importing text from Windows.
(If you know of a program which shows the unprintable characters without the hassle, let me know. The "To-Unix" and "To-Windows" translator scripts don't show the characters--they just change them--and they aren't infallible.)
I still screw up and wonder why the "quick edit" I did in Notepad results in catastrophic failure of a web page on a Unix/Linux server. (I have replaced Notepad with a "proper" editor with a switch to enable Unix-style line endings, but I forget from time to time and wonder why stuff doesn't work. D'oh!) ****/
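(For what it's worth, if no converter is installed, something along these lines will do the stripping, and cat -v will at least make the stray carriage returns visible as ^M; the file names are placeholders:)
Code:
tr -d '\r' < file_from_windows.txt > file_for_unix.txt

cat -v file_from_windows.txt | less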

BTW: The information returned from a machine can be used to call an alternate set of scripts for a Windows machine--which means you can do it all from one workstation in an automated fashion, which in the long run will save you time, effort, and sanity. This way, every machine you have any responsibility for will have a full set of logs which track every patch and every update (anything you choose to include) in one easily backed-up and maintained location. It will simplify all of the administration tasks, report generation, disaster planning, and security which you may be responsible for.

BTW, the sequel: Windows and Linux both have tools which will report all of the patches and other information from the machines (like a software inventory of all the Windows machines for the BSA, and a hardware inventory and chipset information for you). Neither will report--nor should report--any sensitive information: passwords, private keys, administrative passwords, or anything else which could be used to gain entry to your network, your servers, your wireless vpn keys, encryption keys, etc. Much of that information, aside from their personal password, should be unavailable to the USERS of the individual machines.

Yes, it is a real pain in the ass to set up (and it can be done from your workstation if you are cunning/diabolical enough), but security is a must these days. The laws are getting strict, and no one knows exactly how much legal exposure a company--or the administrator--has until every possibility is decided in court. It is best to be paranoid.

Last edited by Eqwatz; 01-28-2004 at 01:18 PM.
 
Old 02-02-2004, 10:36 AM   #6
webamoeba
LQ Newbie
 
Registered: Nov 2003
Posts: 18

Original Poster
Rep: Reputation: 0
Good God - look at all that writing - I suppose I ought to read it.
Quote:
Why are you: rm $1?
Simple - to remove the file from the root of the hard drive. I don't want to download directly to the desired location in case the transfer fails while it's overwriting an existing file, e.g. lilo.conf.

Quote:
OK, I'm a little bit paranoid: make sure that any errors or failures you can think of return some sort of message, either to a log, to standard out, or as a generated e-mail to the administrator (you). The last is much preferred.
Ahhh, are you familiar with Novell ZEN imaging? I can't e-mail - this is a very cut-down kernel. Making a log file is all very well, but who's going to read it when there are 300 machines? I could TFTP the error log up to the server, but I don't want to give the server write access for TFTP. This is only for non-essential sort of stuff, so if a few machines are missed it's not a problem.

Quote:
You initialize a tempfile, then use >> (append redirection, which adds the next item on a new line at the end of the file). [...] Then add a script function to e-mail all the pertinent information back to you. And don't forget to clean up and remove the tempfile after all is done, success or failure. [...]
Sounds good. (I'm new, don't forget, lol)

Quote:
[...] which then enables you to securely change administrative passwords, vpn keys, secure-shell keys, new encryptions/logins/passwords for wireless networking, MAC address assignments, and other stuff you need to do.
You can change the MAC address?!
Quote:
Automated searching of the machine-specific log files you create can enable you to write scripts to change every machine- and user-specific network login, encryption, services, settings, software, group memberships, privileges, users, available server and network resources--anything you can do physically at the machine, with the exception of adding and removing hardware--remotely from your workstation, and as securely as if you were doing it at the keyboard of the machine itself. [...]
head starts hurting

Quote:
NOTE: Size alone is not sufficient to verify a file transfer; I know this from bitter experience. [...] It is always good policy to distribute files with some means of verification.
Good point - will look into this
Quote:
[...] this avoids the kind of errors I make which drive me up the wall. (Sometimes I wish I could fire myself.)
*g*

Thanks for all the tips at the end. Will take a look at the things included in the ZEN distribution more carefully.

Thanks again.
 
Old 02-03-2004, 09:58 AM   #7
Eqwatz
Member
 
Registered: May 2003
Distribution: Slack Puppy Debian DSL--at the moment.
Posts: 341

Rep: Reputation: 30
ABOUT MAC ADDRESSES:
Most programmable devices will in fact allow you to change the MAC address. "Crackers" use this ability as part of a "spoof" attack to gain entry to wireless (and regular) improperly secured networks.

The MAC address has not been a viable means of authentication since (perhaps) the early 1990s. (Actually, it could be earlier. I have never seen an external modem which wasn't capable of this--even if you had to do it with dip-switches.) It doesn't have enough digits available to cover every device ever manufactured which can communicate on a network.

Some devices were designed to be reprogrammed from the very same network which they report their MAC.

Whether it is something you need to worry about depends on the level of security you are trying to maintain for your particular situation.

ABOUT MY BEING A MORON:
Sorry about all of that; I am not familiar with ZEN or any limitations it may have. I inappropriately assumed a bash script. I found that learning bash scripting enabled me to understand and utilize other scripting languages (including Windows).

Also, I admit to using e-mail for things for which there are other means of communication between the systems, because that is what I learned first--certainly not because it is the best or only way to do it. I was trying to articulate an example, not your specific solution.

Regardless of the operating system, or the application, there will always be a means of generating records with success/error messages and a means to generate reports from the results.

SCRIPTING IN GENERAL:
I will admit freely that it is something I do not enjoy, and that I have to review everything in detail before I write a new one. (You could say that I am a "new-bie" every time I am forced to write a script--regardless of the language in which it is written or the O.S. upon which it will run.)

/** I also will freely admit that there was much "wailing and gnashing of teeth" when I was learning this stuff, the learning curve is steep. **/

_____-------_____-------______-------______

MY FEEBLE EXPERIENCES:
What I have learned from brutal experience is:

Define with exacting precision what it is I want to do.

Do a search on the net to make sure someone hasn't already written a script or freely available program which either does what I want to do or can be cleaned up and altered to do my defined task. I am very guilty of "re-inventing the wheel". Unless someone wrote it on drugs, it is faster to clean up and reuse than to start from scratch.

After taking a complete mental break(1), return to it and define the individual tasks which will--when added together--complete the "job".

From that point on, the scope of your attention should be limited to each task from start to completion.

After taking another complete mental break, try to define all possible errors for each individual task and how they will be handled--so in the event of failure (for any individual task) everything will be cleaned up and end gracefully with an understandable (and usable) error message.

/** NOTE: Never count on "garbage collection" to clean up after you, it may work well most of the time--yet leave you with an intermittent problem which will cause grave mental injury to you when you need to track it down at a later date. Aside from a few *tics* and twitches, I am almost recovered from the last time I trusted the "built-in garbage-collection" in a scripting language. **/

Take some time to look at the script to see if (with little effort) it can be transformed into a usable tool or utility--always look for ways to make anything you write reusable. It will save you time in the future. It is a source of justifiable professional pride, and it is a form of currency to gain access to other people and scripts which they have written.

Oh, yes. Always take pride in your work. Document who wrote it, the date and time of the release, the version and updates to it, and what it does--even if you don't release it to the public and you are the only one who will ever use it. For one, it will give you a sense of accomplishment beyond just having the thing work. For another, you may not remember what it is, and may not want to have to run it with --help or -? just to find out. You are going to put in something which echoes usage, version, and options (if you decide to make it a utility), aren't you?

One provision on any script or utility which is very nice to have is the option to return a human-readable success message if someone uses -v (verbose) when calling the script.
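A bare-bones sketch of that sort of self-documentation at the top of a script (the name, version string, and usage text are made up):
Code:
VERSION="mytool 0.1 (2004-02-03)"
VERBOSE=no

case "$1" in
	-h|--help|-\?)	echo "usage: mytool [-v] [--version] file ..."; exit 0 ;;
	--version)	echo "$VERSION"; exit 0 ;;
	-v)		VERBOSE=yes; shift ;;
esac

# ... the real work goes here ...

if [ "$VERBOSE" = yes ]
then
	echo "mytool: finished successfully"
fi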

(1) I have to walk away (physically or mentally) and do something completely different for at least a few minutes. If I don't, I personally end up with unnecessary complexity, incomplete error handling, or something "quick and dirty" which isn't well thought out (or reusable). Just from my postings you can tell that I have a tendency to make things way more complicated than they have to be.

/*** The script I was describing was something I was working on and never got around to finishing. Why? Because someone had already written a program which does pretty much the same thing in a similar way. (I hadn't done a search before starting to write it; I have a bunch of stuff like that.) Since I don't write scripts or programs on a daily basis, I don't have the opinion that mine will be "THE ANSWER" or "THE KILLER-APP". Most of the scripts you find on the Internet in the newsgroups are clean and very good. People who post finished scripts are courageous and meticulous. ***/


Last edited by Eqwatz; 02-03-2004 at 11:19 AM.
 
  

