LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   -   CRON Jobs (https://www.linuxquestions.org/questions/linux-software-2/cron-jobs-116747/)

scottpioso 11-16-2003 02:33 PM

CRON Jobs
 
Hello,

I am running an SMB server that I use for centralized file storage as well as a software storage center, so that my clients can easily access it and download whatever software they see fit.

My question is this: I would like to set up a cron job to automatically download daily updates for an anti-virus package and possibly push those updates down to the clients. I know the first part is possible, but I'm not sure about the second.

Someone have any ideas? Thank you.

Tinkster 11-16-2003 02:42 PM

Hi Scott,

and yes, the Cron job download is a piece of cake :)

The other thing (pushing the files) is quite straightforward,
too, if the clients happen to be linux machines. You could
set-up a password-less ssh connection for a particular
user (like antivir) and then simply
scp new_signatures antivir@<target_host>:/<path>/<to>/<sig>

Cheers,
Tink
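For the curious, the push step Tink describes could be wrapped in a small loop. This is only a dry-run sketch: the host names, the antivir user, and the target path are placeholders, and the scp commands are printed rather than executed.

```shell
#!/bin/bash
# Dry-run sketch of pushing new signatures to several Linux clients.
# Host names, user, and target path are all hypothetical placeholders.
hosts="client1 client2 client3"   # made-up target machines
sig_file="new_signatures"         # file the download job produced

# With passwordless ssh keys set up for the antivir user, a real job
# would run these scp commands instead of just printing them.
cmds=$(for host in $hosts; do
    echo "scp $sig_file antivir@${host}:/path/to/signatures/"
done)
echo "$cmds"
```

Dropping the `echo` turns the dry run into the real push, once the keys are in place.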

scottpioso 11-16-2003 02:50 PM

Hi Tink,

Thanks for the help. Now since I know it's possible, let me go into specifics. The clients are Windows clients (hence my use of smb).

I use Norton Anti-Virus on the Windows machines and I would like to automatically retrieve the daily Intelligent updater definitions. I get those from this site:


http://securityresponse.symantec.com...es/US-N95.html

I download those into a directory called:

/software/Norton Definitions

Now, as you may or may not know, those updates are daily. Can you tell me what I need to do to set up my linux server to retrieve those updates?

The second batch of instructions you gave me are a little harder for me to understand. I've only been working with Linux for less than a year and I'm still learning quite a bit, however, my time is limited now due to my work schedule. Thanks.

Tinkster 11-16-2003 03:05 PM

Hummmm :)

I'd go with using a bash script ...
first, download the page's HTML code from the site
using wget, then filter the page for .exe and .zip
links, and use wget again to retrieve them.

As for the distribution bits ... I'm not sure about
ssh clients for windows, and interaction with
linux' ssh passwordless authentication, and
as for the time being am not going to read up
on it. :} But you could write a little windows.cmd
file, store that on the samba server, and use
windows' scheduled task feature to run it.

Cheers,
Tink

scottpioso 11-16-2003 03:21 PM

Very interesting,

I knew that there would be a way to do this. . . . However, not to sound stupid, but I've never used WGET before. Can you help me with this?

Secondly, you mention filtering out for the .exe and .zip files. Umm, I need some help with this. As I said, I've not been using Linux for more than about 8 months so your patience would be appreciated. Thanks.

Tinkster 11-16-2003 04:21 PM

Code:

#!/bin/bash
wget http://securityresponse.symantec.com...es/US-N95.html
for getIt in `awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html`
do
  wget http://securityresponse.symantec.com/$getIt
done

I haven't tested it since I don't need the
files, but it should work :} Have a play with it.


This will download all EXEs and ZIPs from that
page .... hope this is what you needed?


Cheers,
Tink

scottpioso 11-16-2003 05:13 PM

Yep, that is what I wanted. Thanks a lot!! With that script, at what times will it run or how would I set it to run at a specific time? Tell me, did you write that script out yourself? I've only had one Unix class and our shell scripting segment didn't get nearly that complex. I'm kind of a total novice when it comes to writing shell scripts.

Secondly, and I don't know if you can help with this, but you mentioned setting a scheduled task on the windows machine to run it, however, with that I would have to click on the dialog boxes that appear when the Intelligent Updater is run.

Thirdly, what directory would that script have to be located and what should I name the file in order to run?

And Fourth, I wish the files to be downloaded to a specific directory, where would I specify the directory to save the files to?

Tinkster 11-16-2003 07:24 PM

Quote:

Originally posted by scottpioso
Yep, that is what I wanted. Thanks a lot!! With that script, at what times will it run or how would I set it to run at a specific time?
You'll have to set-up a cron job for that.
This script is just what's meant to run at a
given time that you have to choose yet.

Quote:

Tell me, did you write that script out yourself? I've only had one Unix class and our shell scripting segment didn't get nearly that complex. I'm kind of a total novice when it comes to writing shell scripts.
The script is simple, so is the call to awk :}

Quote:

Secondly, and I don't know if you can help with this, but you mentioned setting a scheduled task on the windows machine to run it, however, with that I would have to click on the dialog boxes that appear when the Intelligent Updater is run.
Ummm ... maybe we should move that part
to the General forum mate, this has NOTHING
to do with Linux anymore ;)

Quote:

Thirdly, what directory would that script have to be located and what should I name the file in order to run?
Whatever you want to call it. And put it wherever
you feel like. For cron to be able to run it you'll have
to specify the full path to it, anyway.
You'll have to
chmod u+x <filename>
to be able to run it.
But if you're doing nifty things like this you
really should read up on what you're doing ;)

Quote:

And Fourth, I wish the files to be downloaded to a specific directory, where would I specify the directory to save the files to?
If you had a specific directory in mind you'd
probably want to put a
cd /<your>/<specific>/<directory>
as the second line of that script.

Cheers,
Tink

scottpioso 11-17-2003 02:57 PM

Hi Tink,

You said the second line of this script? So you mean after bin/bash or after wget?


#!/bin/bash
wget http://securityresponse.symantec.co...ges/US-N95.html
for getIt in `awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html`
do
wget http://securityresponse.symantec.com/$getIt
done


Also, let's say I want this script to run at 1700 (5PM) every day. Where would I put that in?

And you made a comment about reading up on what I'm doing, where would you suggest doing that? I've only been running Linux for less than a year so that's why my questions probably seem simple to you.

Tinkster 11-17-2003 03:27 PM

Hi Scott!

Quote:

Originally posted by scottpioso
You said the second line of this script? So you mean after bin/bash or after wget?
After the sha-bang line :)
add a
cd /<your>/<target>/<directory>

Quote:

Also, let's say I want this script to run at 1700 (5PM) every day. Where would I put that in?
crontab -e
add a line
00 17 * * * /path/to/your/script > /var/log/antivir 2>&1

Quote:

And you made a comment about reading up on what I''m doing, where would you suggest doing that? I've only been running LInux for less than a year so that's why my questions probably seem simple to you.
Heh ... the questions sound simple, the answers
take some time ;) ...
http://www.icon.co.za/~psheer/book/ is a pretty good
introduction, and there's always the man pages...
man man for help on how to use them...
If you have questions about cron, for instance,
try a
man -k cron
(alternatively, apropos cron)

Cheers,
Tink
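A side note on logging cron output: the order of the redirections decides whether stderr makes it into the log. `> logfile 2>&1` captures both streams; `2>&1 > logfile` leaves stderr on the terminal. A throwaway demo with temp files:

```shell
#!/bin/bash
# Show why redirection order matters when logging both stdout and stderr.
log_wrong=$(mktemp)
log_right=$(mktemp)

# 2>&1 first: stderr is duplicated onto the current stdout (the terminal)
# *before* stdout is pointed at the file, so "err" never reaches the file.
sh -c 'echo out; echo err >&2' 2>&1 > "$log_wrong"

# File first: stdout goes into the file, then stderr is pointed at the
# same place, so both lines land in the file.
sh -c 'echo out; echo err >&2' > "$log_right" 2>&1

echo "wrong order log contains: $(cat "$log_wrong")"
echo "right order log contains: $(tr '\n' ' ' < "$log_right")"
```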

scottpioso 11-17-2003 03:51 PM

I'm sorry, I know you're kidding about the sha-bang line, but I guess I'm a little slow on the uptake, what did you mean by that?

Tinkster 11-17-2003 04:01 PM

Quote:

Originally posted by scottpioso
I'm sorry, I know you're kidding about the sha-bang line, but I guess I'm a little slow on the uptake, what did you mean by that?
# = hash = sha
! = bang

No worries :) ... this is being used in the
advanced bash howto (I think?) ...
So, #!/bin/bash = the sha-bang line

Cheers,
Tink

scottpioso 11-18-2003 07:56 AM

Hi Tink,

One more question for you if you would be so kind. I was wondering, since my scripting experience is very limited, if you could break down that script and tell me exactly what each command does? I'm just really curious.

Secondly, my shell scripting experience, as you can tell, is very limited and I was wondering how I could get better at doing it? Are there books like dummies books that start from the beginning, give good practice exercises, and advance? I was just curious.

Anyway, I think later today I will get started on this little project. Thanks again!

Tinkster 11-18-2003 12:20 PM

Quote:

Originally posted by scottpioso
Hi Tink,

One more question for you if you would be so kind. I was wondering, since my scripting experience is very limited, if you could break down that script and tell me exactly what each command does? I'm just really curious.
Sure, even though the script is far from complex.
Code:

#!/bin/bash  <- tells the shell which interpreter to use; bash in this case

wget http://securityresponse.symantec.co...ges/US-N95.html
    <- wget downloads the URL into the current directory

for getIt in `awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html`
    <- awk, a powerful text-processing tool, looks through the downloaded
       file for lines containing a string that begins with "=" and ends in
       either ".exe" or ".zip", and prints the 2nd field of each such line.
       -F sets the field delimiter (space by default, '"' in our case; look
       at the HTML file with an ASCII editor and you'll know why). The
       matching lines (6 at the time I downloaded) are handed to the
       shell's built-in list feature by wrapping the whole awk call in "`",
       and the shell runs the do/done loop once per occurrence awk found

do  <- start of the loop body: wget one file per iteration, its name held
       each time in the variable $getIt; since the strings we found are
       relative addresses we prepend the Symantec site, concatenating the
       full URL
  wget http://securityresponse.symantec.com/$getIt
done    <- end of loop
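To watch the field-splitting in isolation, here is a toy run on one fabricated line shaped like the page's markup. The href value is made up, and the pattern is a slightly simplified, portable variant of the one above (it tests field 2 directly instead of using the GNU-awk `\>` word boundary):

```shell
#!/bin/bash
# Run a simplified version of the thread's awk filter on one fabricated
# HTML line (the href path is made up for the demo).
line='<a href="/avcenter/download/pages/sample-defs-x86.exe">Download</a>'
# Splitting on double quotes, field 1 is '<a href=' and field 2 is the path.
url=$(echo "$line" | awk -F'"' '$2 ~ /\.(zip|exe)$/ {print $2}')
echo "extracted: $url"
```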

Quote:

Secondly, my shell scripting experience, as you can tell, is very limited and I was wondering how I could get better at doing it? Are there books like dummies books that start from the beginning, give good practice excercises, and advance? I was just curious.

Anyway, I think later today I will get started on this little project. Thanks again!
Curious is a VERY good start, really :}

Look at this - either download or read online.
http://www.tldp.org/guides.html#abs

Also there are "Shell-programming for dummies" books
out there.

A highly recommended desktop reference
is O'Reilly's "Linux in a Nutshell".


Cheers,
Tink

scottpioso 11-18-2003 03:24 PM

Tink,

I really appreciate all of your help. I thought I was going to be able to get started on this today, but like so many of my days lately, I haven't been able to. See, I work nights and my days are spent mostly in bed and now I'm just getting up and I don't think now is the time to really get started on this project. I guess I'll wait until the weekend. Oh well. . . Hey, if you're ever in the area, look me up. Where are you from anyway? I'd have to guess by your use of language that you're from Australia?

Tinkster 11-18-2003 04:00 PM

Quote:

I really appreciate all of your help. I thought I was going to be able to get started on this today, but like so many of my days lately, I haven't been able to. See, I work nights and my days are spent mostly in bed and now I'm just getting up and I don't think now is the time to really get started on this project. I guess I'll wait until the weekend. Oh well. . .
Again, no worries ... and go for it whenever
you have the time. If you have further questions
or my explanation was TOO detailed/not detailed
enough, give's a yell ;)


Quote:

Hey, if you're ever in the area, look me up. Where are you from anyway? I'd have to guess by your use of language that you're from Australia?
The 0800 number doesn't really indicate an area ;)

And I'm German of slovene origin living in New Zealand ;)


Cheers,
Tink

scottpioso 11-22-2003 02:41 PM

HI ya Tink,

How's your weekend going? Mine is okay.

So far, I've written the script but I'm having problems making it an executable file. I tried using chmod a + x on it but when I try to run it, I can't. But anyway, I wanted to let you see what I've done so far. Like I said, I haven't been able to test it at all yet. Like you said, I first have to make it an executable file, correct? Maybe you could help me with that? Thanks.

-------------------------------------------------------------------------------------------------
# This shell script was written by Scott Pioso, with the help of Tinkster to
# allow network administrators to automate the downloading of Norton Anti-Vir-
# us daily virus definition updates (Intelligent Updater) to be downloaded
# automatically to a linux server for distribution to clients running a
# graphical user interface from another well known GUI OS manufacturer.
# This script is freely distributable and the author claims no copyright or any
# other proprietary hold on it.
# You may freely modify or adjust this script as you see fit

#!/bin/bash
cd /software/NortonDefs

wget http://securityresponse.symantec.com...es/US-N95.html
for getIt in 'awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html'

do
wget http://securityresponse.symantec.com/$getIt
done

Tinkster 11-23-2003 12:14 PM

Quote:

Originally posted by scottpioso
HI ya Tink,

How's your weekend going? Mine is okay.

So far, I've written the script but I'm having problems making it an executable file. I tried using chmod a + x on it but when I try to run it, I can't. But anyway, I wanted to let you see what I've done so far. Like I said, I haven't been able to test it at all yet. Like you said, I first have to make it an executable file, correct? Maybe you could help me with that? Thanks.
Weekends are always too short ;}

What error/problem are you facing with running
it? How were you trying to run it?

Cheers,
Tink

scottpioso 11-23-2003 03:53 PM

Hi ya tink,

What I did after I wrote the script was this:

cd'd into the proper directory and chmod 754 for the file norton.sh

Then I did the following:



[root@ASUSA7V266-E software]# ls
Adaware.tar.gz MSConfig-Windows2000.tar.gz Soundblaster.tar.gz
Adobe.tar.gz MSOffice.tar.gz SystemCleaner.tar.gz
Cisco Configmaker.tar.gz NortonDefs tomsrtbt-2.0.103.tar.gz
gaim-0.72-1rh9.i386.rpm norton.sh Trillian.tar.gz
Intel.tar.gz Norton.tar.gz Trojan.tar.gz
LinuxUtilities.tar.gz PQMagic.tar.gz vnc-3.3.7-1.i386.rpm
lost+found putty.tar.gz VNCforWindows.tar.gz
MeayaPopupFilter.tar.gz Quicken.tar.gz vnc-linux.tar.gz
Moosoft.tar.gz RegCleaner.tar.gz
Mozilla.tar.gz Roxio.tar.gz
[root@ASUSA7V266-E software]#

[root@ASUSA7V266-E software]# norton
bash: norton: command not found
[root@ASUSA7V266-E software]# norton.sh
bash: norton.sh: command not found
[root@ASUSA7V266-E software]#

Obviously, I'm doing something wrong, and I know it's simple, I just don't know what to do to fix it so it will run as a script.

Here also is the script that I wrote:
--------------------------------------------------------------------------------------------------
# This shell script was written by Scott Pioso, with the help of Tinkster to
# allow network administrators to automate the downloading of Norton
# Anti-virus daily virus definition updates (Intelligent Updater)
# to be downloaded
# automatically to a linux server for distribution to clients running a
# graphical user interface from another well known GUI OS manufacturer.
# This script is freely distributable and the author claims no copyright
# or any other proprietary hold on it.
# You may freely modify or adjust this script as you see fit

#!/bin/bash
# This statement is the default directory that you should create on your
# Linux system. You may change this if you see fit.

cd /software/NortonDefs

# This next statement automatically will call the system to download the files
# necessary from the Norton.com website.

wget http://securityresponse.symantec.com...es/US-N95.html
for getIt in 'awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html'

do
wget http://securityresponse.symantec.com/$getIt
done
--------------------------------------------------------------------------------------------------

Thanks for your help again!

Tinkster 11-23-2003 04:12 PM

Quote:

Originally posted by scottpioso
[root@ASUSA7V266-E software]# norton
bash: norton: command not found
[root@ASUSA7V266-E software]# norton.sh
bash: norton.sh: command not found
Yep ... try ./norton.sh

The current directory isn't in root's $PATH
for a (probably not too obvious)
security reason. Imagine an evil user with a
local account on your machine created
a bash script called ls containing rm -rf /,
and put it into /tmp ... if you (as root) went
into /tmp and ran ls to look at the directory,
you could wipe your HDD ... therefore, if
you need to execute a file in the current directory,
you have to precede it with ./ (or use the
fully qualified path).


Cheers,
Tink
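The point is easy to try out in a scratch directory (the script name and message below are made up for the demo):

```shell
#!/bin/bash
# Demonstrate running a script from the current directory.
demo_dir=$(mktemp -d)     # throwaway scratch directory
cd "$demo_dir"

printf '#!/bin/bash\necho hello from norton-demo\n' > norton-demo.sh
chmod u+x norton-demo.sh        # make it executable, as in the thread

got=$(./norton-demo.sh)         # explicit ./ path: this works
echo "with ./ : $got"

# The bare name only works if "." is in $PATH, which it normally is not:
norton-demo.sh 2>/dev/null || echo "bare name: command not found"
```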

scottpioso 11-23-2003 04:17 PM

Hmm, got it now, however, I'm getting some errors here. Check it out. . .

root@ASUSA7V266-E software]# ./norton.sh
--16:12:23-- http://securityresponse.symantec.com...es/US-N95.html
=> `US-N95.html'
Resolving securityresponse.symantec.com... done.
Connecting to securityresponse.symantec.com[64.124.201.45]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 13,224 [text/html]

100%[====================================>] 13,224 137.38K/s ETA 00:00

16:12:23 (137.38 KB/s) - `US-N95.html' saved [13224/13224]

./norton.sh: line 23: unexpected EOF while looking for matching `"'
./norton.sh: line 28: syntax error: unexpected end of file

scottpioso 11-23-2003 04:22 PM

Also I just found that it's downloading the entire page, not just the zips and exe from the page. Why is that happening do you think?

Tinkster 11-23-2003 05:00 PM

Quote:

Originally posted by scottpioso
Also I just found that it's downloading the entire page, not just the zips and exe from the page. Why is that happening do you think?
The error seems to be that you have
replaced the
Code:

`
around the entire awk statement with
Code:

'
As for the fact that it downloads EVERYTHING,
I wouldn't know ... maybe you have a different
version of wget, maybe there's an alias, ...


Run the wget manually, check whether the entire
site comes, or just the page you actually requested.



Cheers,
Tink

scottpioso 11-23-2003 05:11 PM

Hi Tink,

I don't know either, however, what seems to be happening when I run the script manually is that it apparently runs in two steps. If you look, after it downloads, it apparently returns to the normal shell prompt and then tries to run awk, which I have to assume it doesn't know what to do with.


[root@ASUSA7V266-E scott]# wget http://securityresponse.symantec.com...es/US-N95.html
--17:04:11-- http://securityresponse.symantec.com...es/US-N95.html
=> `US-N95.html'
Resolving securityresponse.symantec.com... done.
Connecting to securityresponse.symantec.com[209.8.166.179]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 13,224 [text/html]

100%[====================================>] 13,224 167.72K/s ETA 00:00

17:04:11 (167.72 KB/s) - `US-N95.html' saved [13224/13224]

[root@ASUSA7V266-E scott]# for getIt in 'awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html'
> do
> wget http://securityresponse.symantec.com/$getIt
>
> done
>

Tinkster 11-23-2003 05:18 PM

Quote:

Originally posted by scottpioso
[root@ASUSA7V266-E scott]# for getIt in 'awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html'
If the board didn't change them you're still
using the wrong quotes ....
Code:

[root@ASUSA7V266-E scott]#  for getIt in `awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html`

Cheers,
Tink

scottpioso 11-23-2003 05:26 PM

I'm sorry? I'm not exactly following you. I'm using the single quote which is right below the double quote key to the right of the colon keys ;: Which quote should I be using?

Tinkster 11-23-2003 06:44 PM

Quote:

Originally posted by scottpioso
I'm sorry? I'm not exactly following you. I'm using the single quote which is right below the double quote key to the right of the colon keys ;: Which quote should I be using?
The one above the TAB key (it shares a key
with the ~ (tilde) character) ...

Cheers,
Tink
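For anyone following along, here are the quoting styles side by side; the modern `$( )` form does the same job as backticks and nests better:

```shell
#!/bin/bash
# Three kinds of quoting around the same words:
plain='echo hi'        # single quotes: just the literal text, nothing runs
ticks=`echo hi`        # backticks (above TAB): run the command, keep output
modern=$(echo hi)      # $( ) does the same as backticks and nests better

echo "single quotes : $plain"
echo "backticks     : $ticks"
echo "\$( )          : $modern"
```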

scottpioso 11-23-2003 06:57 PM

ah ha!! Okay, well, so every place where you see a single quote should have that type instead of the one I was using, eh? Cool. Well, I'll have to continue this tomorrow as I have to take a nap now as I have to be at work in three 1/2 hours. I hate working nights. ARghhh!! Thanks a lot Tink

teval 11-23-2003 07:21 PM

You can do this much much simpler...
Make wget filter out what files it downloads.
info wget will give you that..
info wget Following\ Links Types\ Of\ Files

That will tell you how to use it. Will replace that entire script with something simple :)
It should be:

Code:

wget -A '*exe,*zip,US-N95*' --follow-ftp -H -k -r -N -l 1 -w 0 -nd -e robots=off http://securityresponse.symantec.com...es/US-N95.html
Yes.. it does download 4 extra html files, but it deletes them immediately, and doesn't get the gif files from them or anything. All.. you get is.. 1 html file and all the exe files you want.
Just like the other one.. but this is less error prone.
Also.. you don't need a script for it, just dump it into cron as is :)
And.. if you know there are certain exe or zip files you never want, just add an option in there:
-R <partial name with * where it fits in>

scottpioso 11-24-2003 07:58 AM

Hi Teval,

Thanks for your input. Perhaps you can tell me how to "dump this into cron" as is. What should the name of this file be and do I still have to chmod 754 to it to make it executable?

The tinkster has really tried to help me and I sure appreciated all of his help. But as I said, the script didn't do what I thought it was supposed to do.

teval 11-24-2003 04:04 PM

So you want a program that just downloads the exe and zip files?
If you want just that, put that in cron, with an added -P option that works like this

-P <directory wget should download to>

Just add it to the current wget arguments :) This would be the program that cron runs. It's just wget, with a lot of arguments.

What exactly do you want?
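For reference, teval's wget line with the -P option folded in might look like this. It is a sketch only: the URL is the truncated one from the thread and would need the full address, and the target directory is the one Scott mentioned earlier.

```shell
# Sketch: download only the exe/zip files straight into the target
# directory (fill in the full URL before using).
wget -A '*exe,*zip,US-N95*' --follow-ftp -H -k -r -N -l 1 -w 0 -nd \
     -e robots=off -P /software/NortonDefs \
     http://securityresponse.symantec.com...es/US-N95.html
```

This single command is what could go straight into a crontab entry, with no surrounding script.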

Tinkster 11-24-2003 04:56 PM

Quote:

Originally posted by teval
You can do this much much simpler...
Make wget filter out what files it downloads.
info wget will give you that..
info wget Following\ Links Types\ Of\ Files
Heh ... I should be downloading more, thanks
for bringing that up :}


Cheers,
Tink

scottpioso 11-24-2003 05:25 PM

Hi Teval,

What exactly do I want? Well, I want to automatically schedule a job to download the Intelligent Updater updates at a specific time every day into a directory called /software/nortondefs. I would like this to run at 1800 hours local time Mondays through Fridays (as Norton doesn't write updates on weekends). I also wish to push these updates down to the Windows clients for which this is intended in the first place, but that's another issue that I will address later in a different forum.

With all respect, if you had read my first postings, you would have seen what I was trying to do. So, in this regard can you help me to simplify this task? My scripting experience is very, very limited and I'm quite new to Linux as well, so this is why I kind of need you to take baby steps with me. After I do more and more, it gets easier but for now, your patience is appreciated. Thanks

teval 11-24-2003 08:35 PM

No problemo

That wget script with the -P option will handle the downloading.
As for pushing to the windows clients, you can do that over Samba, but I've never had to interoperate with Windows, so I have no clue how to. Look up Samba on the web, take the command you find, and the wget command, put them in a script, and off you go. :)

scottpioso 11-25-2003 07:50 AM

Hi Teval,

I'm confused though, you say to just dump the script into cron. I'm not clear about what you mean exactly. Thanks.

teval 11-25-2003 09:39 AM

I thought that you only wanted to get the files so I was suggesting you put the wget straight into cron.
For more than 1 command you should use a script, or &&'s.
&& is also good because it checks return values, and if an error occurs it aborts everything. An sh script is better though :)

scottpioso 11-25-2003 01:46 PM

Okay thanks then Teval,

So I guess I need to go back to what Tink was suggesting. So, can you help me to accomplish what I was doing? And secondly, your use of language is not clear for me to understand. i.e. "For more then 1 command you should use a script, or &&'s.". As I said before, my scripting experience is very limited, so baby steps, buddy, baby steps. Thanks.

Tinkster 11-25-2003 01:56 PM

The double & means to the shell that in
"command1 && command2"
command2 will only be executed if command1
was successful. It is, so to speak, chaining two
commands, and it's not a shell scripting thing,
you can do that on the command-line as well....

e.g. if you ran
make menuconfig
to configure your kernel, you can then continue
with
Code:

make dep && make bzImage && make modules && make modules_install

Cheers,
Tink
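The short-circuit behaviour is easy to see in two lines (the `|| true` is only there so the failing example doesn't abort a shell running with -e):

```shell
#!/bin/bash
# && only runs the right-hand command if the left-hand one succeeded
# (i.e. exited with status 0).
ran_after_true=$(true && echo yes)
ran_after_false=$(false && echo yes) || true   # echo never runs here

echo "after true : '$ran_after_true'"
echo "after false: '$ran_after_false'"
```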

scottpioso 11-25-2003 01:59 PM

Hmm, okay thanks Tink,

Well, I will continue this project on the weekend. This is a holiday weekend for us so I have Wednesday and Thursday night off.

scottpioso 11-26-2003 09:38 AM

Hi ya Tink,

Well, I'm home for a couple of days and as mentioned before, I'm getting an error when I run the script and it's also downloading the entire page instead of just the zips and exe files. This is the error. . . Could you tell me what's wrong?

[root@ASUSA7V266-E scott]# ./norton.sh
--09:30:30-- http://securityresponse.symantec.com...es/US-N95.html
=> `US-N95.html'
Resolving securityresponse.symantec.com... done.
Connecting to securityresponse.symantec.com[64.124.201.150]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 13,224 [text/html]

100%[====================================>] 13,224 113.28K/s ETA 00:00

09:30:30 (113.28 KB/s) - `US-N95.html' saved [13224/13224]

./norton.sh: line 24: unexpected EOF while looking for matching `"'
./norton.sh: line 30: syntax error: unexpected end of file

Here are lines 23 and 24:

wget http://securityresponse.symantec.com...es/US-N95.html
for getIt in 'awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html'

and there isn't a line 30. The last line is at 29.


Here is the script as it is in its entirety:

# Anti-virus daily virus definition updates (Intelligent Updater)
# to be downloaded
# automatically to a linux server for distribution to clients running a
# graphical user interface from another well known GUI OS manufacturer.
# This script is freely distributable and the author claims no copyright
# or any other proprietary hold on it.
# You may freely modify or adjust this script as you see fit

#!/bin/bash
# This statement is the default directory that you should create on your
# Linux system. You may change this if you see fit.

cd /software/NortonDefs

# This next statement automatically will call the system to download the files
# necessary from the Norton.com website.

wget http://securityresponse.symantec.com...es/US-N95.html
for getIt in 'awk -F'"' '/=.+\.(zip|exe)\>/ {print $2}' US-N95.html'
do
wget http://securityresponse.symantec.com/$getIt

done




Can you help me out here? Thanks, bud.

scottpioso 11-26-2003 11:34 AM

Ah ha!!! I have the script functioning, Tink. However, it's downloading more than what I really need, so I'll have to play with it.

scottpioso 11-26-2003 02:40 PM

Hello yall,

Well, I've done it whoooohoooooo!!!! I spent about 5 hours on it today and here it is in its final version. . . .

--------------------------------------------------------------------------------------------------
#!/bin/bash
#
# Norton Anti-Virus Intelligent Updater Download Script
#
# To run this script, change to the directory where it is located and type
# ./norton.sh to launch the program
#
# The following is a shell script to automate downloading of Norton Anti-Virus
# virus definitions via the Intelligent Updater for storage on a Linux server
#
# This shell script was written by Scott Pioso, with the help of Tinkster, to
# allow Linux network administrators to automatically download the Norton
# Anti-Virus daily virus definition updates (Intelligent Updater)
# to a linux server for distribution to clients
# running a graphical user interface from another well known GUI OS
# manufacturer.

# This script is freely distributable and the author claims no copyright
# or any other proprietary hold on it. You may freely modify or adjust this
# script as you see fit.

# The next statement is the default directory that your Linux system will use
# to store the files downloaded. The directory should be created prior to
# running this program. You may change this directory to something else if you
# desire, however, if you do so you must match that directory to the one listed
# below

cd /software/NortonDefs

# This next statement automatically will call the system to download the files
# necessary from the Norton.com website. This step will download all files from
# the page with an .exe file extension

wget http://securityresponse.symantec.com...es/US-N95.html

for getIt in `awk -F'"' '/=.+\.(exe)\>/ {print $2}' US-N95.html`

do

wget http://securityresponse.symantec.com/$getIt

done

# Finally, the last step is to remove the extraneous files that the author did
# not want to save

rm *.html*
rm *i32-1*
rm *x86*

# Now, if you look in the directory created (NortonDefs) you will see one file
# listed. Should you wish to automate this process and push these files down to
# your MS Windows clients, it will be necessary to create a CRON job to do this

# Questions may be directed to the author, Scott Pioso at scottpioso@yahoo.com
# Thank you and long live Linux!!!!

Tinkster 11-26-2003 03:42 PM

Scott, you could have just used Teval's wget
line and stuck that into your script, and could have
given the for loop a miss ...

I still don't understand why it's getting the entire site,
though. Did you try to find out in which step that
happens? Is it in the original (1st use of wget) or
while you're wgetting from the loop?



Cheers,
Tink

scottpioso 11-26-2003 04:02 PM

Hi Tink,

The problem I was having was related to the quote symbol problems I had before. I was using the wrong quotes and once I figured that out, everything fell together nicely.

scottpioso 11-27-2003 10:02 AM

Hi ya Tinkster,

One last question related to this topic. . . You mentioned going into the crontab file to fully automate this. This is my crontab file as it is. Where do you suggest putting in my job to run?
---------------------------------------------------------------------------------------------------
SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root
HOME=/

# run-parts
01 * * * * root run-parts /etc/cron.hourly
02 4 * * * root run-parts /etc/cron.daily
22 4 * * 0 root run-parts /etc/cron.weekly
42 4 1 * * root run-parts /etc/cron.monthly

Tinkster 11-27-2003 12:37 PM

Anywhere in that list will do, it's not required
to be sorted ... so, if you wanted your script to
run every evening at 8 o'clock you'd go


01 20 * * * root /<path>/<to>/norton.sh


Cheers,
Tink


P.S.: Your "file" looks nothing like mine :)
Where/how did you get that list?

scottpioso 11-27-2003 01:34 PM

Where did I get the file? I went to /etc/crontab. Second, isn't 8:00 in the evening 2000 hours? I'm confused as to how the times correlate to real time.

Also, will the file run in the background so I won't be interrupted with what I may be doing or will it interrupt me?

Tinkster 11-27-2003 02:50 PM

Quote:

Originally posted by scottpioso
Where did I get the file? I went to /etc/crontab. Second, isn't 8:00 in the evening 2000 hours? I'm confused as to how the times correlate to real time.

Also, will the file run in the background so I won't be interrupted with what I may be doing or will it interrupt me?

Hmmm ... must be a RH thing... if I want to
add something to crontab I do
crontab -e, the result is then stored in
/var/spool/cron/crontabs/<user-id>


As for the format
man crontab
Code:

      # MIN HOUR DAY MONTH DAYOFWEEK  COMMAND
      # at 6:10 a.m. every day
      10 6 * * * date

      # every two hours at the top of the hour
      0 */2 * * * date

      # every two hours from 11p.m. to 7a.m., and at 8a.m.
      0 23-7/2,8 * * * date

      # at 11:00 a.m. on the 4th and on every mon, tue, wed
      0 11 4 * mon-wed date

      # 4:00 a.m. on january 1st
      0 4 1 jan * date

      # once an hour, all output appended to a log file
      0 * * * * date >>/var/log/messages 2>&1

So yes, it's 2000 (military speak) but
00 20 for cron ...

Look at this line:
# MIN HOUR DAY MONTH DAYOFWEEK COMMAND

Cheers,
Tink
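Putting the pieces together for the schedule Scott asked for earlier (1800 hours, Mondays through Fridays), an /etc/crontab entry could look like the line below; the script path and log file name are assumptions, and the user field is needed because this is the system-wide crontab.

```shell
# /etc/crontab format: MIN HOUR DOM MON DOW USER COMMAND
00 18 * * 1-5 root /software/norton.sh > /var/log/norton-update.log 2>&1
```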

scottpioso 11-27-2003 06:23 PM

Ah,, I see now, Tink. Thanks. However, you didn't answer my question about running in the background or will this interrupt whatever I may be working on at the time the job runs?

Tinkster 11-27-2003 07:23 PM

No mate, no interruptions ... that's what cron is
for: jobs running unattended.


Cheers,
Tink


All times are GMT -5.