Old 07-14-2009, 04:09 PM   #1
helptonewbie
Member
 
Log all commands executed by any user? Is it possible?


Hi All,

I've been looking around for the last few hours at various methods of logging commands executed in the shell. We're required to do this on our servers to meet some new business requirements.

These are mainly SUSE servers.

I've seen a little on snoopy, but that seems very old. I've also seen some people say the 'script' command can do this, so I'll take a look at that.

Basically, I'm looking for something that will log every command run by any user on the system, recording date/time plus the commands and their arguments, etc. Ideally it would all go into syslog on the server, which then forwards the data to a central syslog logging server.

I've seen a few posts and web pages, but nothing really comes up with a definitive answer.

Is there anything out there that (a) works mainly on Linux and (b) could perhaps be used on HP-UX as well?

Ideas anyone?

Cheers,
M
 
Old 07-14-2009, 04:30 PM   #2
jhwilliams
Senior Member
 
Personally, I think this is intrusive. If you have hard security and good permissions, you shouldn't need to spy on people.

Write a cron job to cat and sort /home/*/.bash_history > /var/log/user_commands
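Something along these lines would do it; the schedule, paths and log name are only an example:
Code:
# /etc/cron.d/user_commands - merge everyone's bash history hourly
0 * * * * root cat /home/*/.bash_history 2>/dev/null | sort -u > /var/log/user_commands
Bear in mind that histories are only written out when a shell exits cleanly, so treat it as best-effort.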

If users are doing something you don't like, then disable the offending functionality (what they used and what you're using ;-) ).

Non-trivial commands show up in system (or user) logs in other ways anyway. There's no need to save info on every
Code:
ls /home/donna/bermuda_vacation/my_fav_pic.JPEG
.

 
Old 07-15-2009, 09:27 AM   #3
unSpawn
Moderator
 
Yes, it's intrusive, but given the right reasons that's the point. I've written a few times about that kind of logging; the last time was here: http://www.linuxquestions.org/questi...fedora-739179/. To the OP: please see post #4, because it applies to you as well, especially the "write in detail about the purpose" part, which you should address here.
 
Old 07-15-2009, 10:24 AM   #4
mlnutt
Member
 
See http://bash-hackers.org/wiki/doku.ph...ng/bashfaq/077 for the problems regarding the reliability of a user's bash history.
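For example, any user can keep commands out of their own history with nothing more exotic than this (the command name is obviously a placeholder):
Code:
unset HISTFILE           # history is never written to disk when the shell exits
HISTSIZE=0               # or keep no history in memory at all
 some_command --secret   # with HISTCONTROL=ignorespace set, a leading space skips the entry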
 
Old 07-15-2009, 01:47 PM   #5
helptonewbie
Member
 
Original Poster
Yep, no worries. I've been trying to reply all day, but it seems LinuxQuestions had some database issues; it certainly errored out with database problems earlier today. I wouldn't use the bash_history method because I know it's totally unreliable.

I should also say it's mainly root and the admins that require logging, due to PCI requirement 10.2.(something), which requires logs of commands etc. (How to do this on Windows should be interesting. :-) )

I still haven't had time to read up on the 'script' command I've heard about in places. unSpawn, I'll read your link now (thanks!!) and report back.

Cheers.
 
Old 07-15-2009, 01:54 PM   #6
helptonewbie
Member
 
Original Poster
lol, reading that post I see an interesting quote - 'i'm not really bothered about the lessor commands' - nice touch.

As it happens I'm very interested in all commands, and I'll take a look at rootsh etc. as suggested. I'm guessing Bash doesn't have anything for this by default then (YET!). I say yet because surely it would be worth implementing, as it's now a requirement for things such as PCI, which happens to be exactly the reason the other guy in that post was looking for something.

Cheers,
M
 
Old 07-15-2009, 01:59 PM   #7
helptonewbie
Member
 
Original Poster
I've just taken a quick look at the main ideas behind the rootsh and sudosh projects. Both look useful; however, with sudosh I notice it hasn't been developed further since, what, 2005? I'm always very cautious when something hasn't been updated for such a long period of time, especially for production systems. I see it so often: I find a decent-looking piece of software and then see it hasn't been updated for ages. Is it just me being over-cautious, maybe?
Cheers,
M

PS - I'll try out rootsh for that reason; at least it was last touched in 2008.

PPS - This also goes for things like rootkit checkers; they all seem way out of date, or at least the ones I find talked about most often are. What gives? I question the point of installing something that perhaps can't detect any rootkits created since its last update (unless they don't work like a virus checker, which checks signatures against a database... but surely there must still be something to update in those kinds of tools).

 
Old 07-15-2009, 04:23 PM   #8
unSpawn
Moderator
 
Yes, we've had some database problems, but they're gone now. Rootsh would be a good choice; I use it myself (as if that's an objective criterion). There aren't as many rootkit incidents as there used to be, and you're right, detection is a disjointed effort. I'd suggest giving it equal or less thought than integrity checking (you definitely want Samhain) and MAC (I mean SELinux; I don't know AppArmor).

Back to the things at hand: if you're up against a PCI-DSS requirement then you'll appreciate playing reverse scenarios: pick a date, a time and a user, then try to supply all the information required to prove that this user U at that time T and place P only issued those commands C with result R. (Rearrange the letters and it'll spell C'PUTER, but that's beside the point.) What I'm trying to convey is that whatever measures you deploy, you have to aim for enough ways to cover all aspects. Overlap is good, as it helps corroborate other "evidence". Let me know what you're proposing to install to address PCI-DSS Requirement 10 and we'll delve deeper into this if necessary.
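To make the reverse-scenario idea concrete, here's a rough sketch, assuming rootsh forwards its session logs to syslog and your auth messages end up in /var/log/messages (the file names, dates and the user "jdoe" are only placeholders):
Code:
# When and from where did user "jdoe" log in around the time in question?
last -F jdoe | grep "Jul 15"
# What did sshd/PAM record for those sessions?
grep -i "jdoe" /var/log/messages | grep -iE "sshd|session"
# Which commands did rootsh hand to syslog in that window?
grep "rootsh" /var/log/messages | grep "jdoe" | grep "Jul 15 14:"
If each of those gives you a consistent answer, you have overlapping evidence; if one of them comes up empty, that's the gap to fix.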
 
Old 07-15-2009, 05:01 PM   #9
helptonewbie
Member
 
Original Poster
Thanks for your reply. I did have a look at Samhain quite a while ago now (some months back); in fact I put a thread on here about it, and funnily enough you were one of the people who replied to it. I tried it out, but it seemed strangely difficult to set up and configure... or maybe that was down to time constraints, I'm not sure, but at the time it certainly didn't seem that easy. (Not saying easy is always the way, but you know, from a management/time point of view it's about deciding what's best to spend time on.)

I've already got a logging server set up for section 10: syslog-ng into MySQL, with a php-syslog front end to search the logs, which I plan to plug rootsh into, of course. I'm using open source Tripwire for file integrity checking (very easy to configure), automated email alerts, etc., but that's mainly because it's what was in place at the company before I got there (Tripwire, I mean)... I'm only a junior. I've looked at AIDE as well but haven't had the time to try it out; it's another thing that looked very out of date, so on that alone I decided it would be a bad choice, for the reasons I gave in my last post. But I know most places I read seem to rate AIDE higher than open source Tripwire. One of those trade-off things, I suppose.

We also have Mac OS X and Windows machines. The Macs' logs we'll catch with standard syslog, of course, and for Windows we'll probably use one of the Windows-to-syslog conversion tools out there (there are a few of those). The other thing I'm having an issue with is section 10.6: alerting on those logs, or reducing the amount of log data that actually needs reading by a human eye, with a more automated system. In an ideal world we'd use Splunk, but it's expensive. I then thought that most servers produce less than 500MB of logs per day, so we could try installing it per server (for the free version), but that again is a management nightmare. So now I'm thinking in the realms of logwatch-type tools and seeing if I can get one of those onto the central log server and do something with that... maybe it would need SWATCH as well, etc.? Personally, though, it might be easier to have logwatch on each separate server, as putting it on the central log server looks slightly complicated.
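If logwatch does end up doing the job, I'm picturing a daily run on the central box along these lines (the options may differ between versions, and the address is made up):
Code:
# summarise yesterday's logs in detail and mail the report
logwatch --range yesterday --detail high --service All --mailto security@example.com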

And I need some method of handling log initialisation (10.2.6), which I'm not entirely sure about yet. The central log server is protected from modifications; hopefully it's good enough that the database is dumped nightly and those MySQL dumps are then watched by Tripwire, so modifications would be detected (as long as Tripwire holds up, of course, and the modifications aren't done inside the database... I'd have to assume the database hadn't been compromised?).
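For the nightly dump side of it, I'm picturing something as simple as this, with Tripwire's own scheduled --check run picking up any change to the dump directory (paths and the database name are just placeholders):
Code:
#!/bin/sh
# dump_syslog_db.sh - nightly dump of the syslog database for Tripwire to watch
# (credentials are expected to come from ~/.my.cnf)
DUMPDIR=/var/backups/syslog
mkdir -p "$DUMPDIR"
mysqldump syslog | gzip > "$DUMPDIR/syslog-$(date +%Y%m%d).sql.gz"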

The rest is pretty much all sorted, I think :-) (I hope.)

I think I've managed to cover most/all aspects, apart from the above?

Thanks for your help,
M

PS - Just out of curiosity, are you some kind of security consultant or just a Linux admin of some sort? I see you in the security forum quite a bit... I like to read the security section as much as I can.
 
Old 07-15-2009, 08:24 PM   #10
unSpawn
Moderator
 
Quote:
Originally Posted by helptonewbie View Post
I did have a look at samhain quite a while ago (..) I tried it out but it seemed strangely very difficult to set-up and configure... or maybe it was due to time constraints not sure, but I felt at the time it certainly didn't appear that easy.

Quote:
Originally Posted by helptonewbie View Post
(not saying easy is always the way but you know from management/time points of view its deciding what is best spending time on).
I know about constraints, sure. The only way to tackle it well, qualitatively speaking, is IMHO to have a clearly defined set of requirements, a design to satisfy them, and a plan chunking the design into tasks to perform. That's basic project management for ya. Dropping things on the basis of "not being bothered" (the other thread, right) or something not being "easy" doesn't address what you should be addressing. If, instead, you dismiss something because an equally good result can be achieved by other means, then you keep your eye on the things that matter.


Quote:
Originally Posted by helptonewbie View Post
I've already got a logging server set-up for section 10 also syslog-ng to MySQL using php-syslog front end to search the logs.. which I plan to plug the rootsh up-to of course. Using open source tripwire for File integrity checking (..) that's mainly because that is what was in place at the company before I got there(tripwire I mean)... I'm only a junior. I've looked at AIDE also but not had the time to try it out (..) I know most places I read seem to rate AIDE higher than open source tripwire.
You being "just" a junior doesn't, or shouldn't, matter. If you can propose a qualitatively better solution then those in charge should be able to recognize it as such (provided you use the right arguments, and provided they know their job from a hole in the ground). I think I addressed Tripwire in your other thread. The difference between passive and active filesystem integrity checkers is something I've touched on more than a few times; let me know if searching LQ for posts doesn't get you the right results.


Quote:
Originally Posted by helptonewbie View Post
The other thing I suppose I’m having an issue with would be section 10.6... the logging and then alerting on those logs or reducing the amount of logs that actually require reading by human eye with a more automated system... (..) thinking in the realms of logwatch type stuff and seeing if I can get that on the central log server and do something with that.. maybe it would require SWATCH as well etc etc....? Personally though it might be easier to have logwatch on each separate server as putting it on central log server looks slightly complicated.
Filtering at the origin means data never ends up in storage: it's gone forever. Filtering from storage means you have everything in case of, say, an audit, but it means you have to keep adjusting regexes to filter for reporting until it stabilises. If it's a requirement to have everything on file, then I doubt origin filtering is what you want.


Quote:
Originally Posted by helptonewbie View Post
central log server is protected from modifications hopefully its good enough that the database is dumped nightly and then those mysql dumps are of course watched by tripwire.. so modifications would be detected (as long as tripwire holds up of course and modifications aren't done inside the database.. I’d have to assume the database hadn't been compromised?..)
How would you trace, say, database corruption?


Quote:
Originally Posted by helptonewbie View Post
The rest is pretty much all sorted for I think :-) (Hope) I think I’ve managed to cover most/all aspects apart from the above?
Let me come up with some scenarios.


Quote:
Originally Posted by helptonewbie View Post
are you some kind of security consultant or just a Linux admin of some sort..
I'd like to have a real job when I grow up?
 
Old 07-16-2009, 05:35 PM   #11
helptonewbie
Member
 
Original Poster
Hi There,
Sorry for the slow reply... very busy day :-)
Quote:
Originally Posted by unSpawn View Post
1.Filtering at the origin means data never ends up in storage: it's gone forever. Filtering from storage means you have everything in case of say audit but it means you have to keep adjusting regexes to filter for reporting until it's stabilised. If it's a requirement to have everything on file then I doubt origin filtering is what you want.

2.How would you trace say database corruption?
1. Not too sure what you're saying here. I'm not wanting to filter out logged data, but I would want to filter it for the purposes of alerting, say on too many access attempts, or anything really. Yes, I can see it being a pain given the number of regexes that could be required, but I'm not sure there's any other choice... logs need to be read and there's no way I can read through everything; I'd miss something. But by the same token, anything I set up that 'regexes' the log data would also miss things. It's another trade-off, I suppose.

2. I don't know a lot about database corruption. I assume I could have something like Nagios querying the tables and checking that output is returned and is the expected output... is that sufficient? I need to read more about possible database corruption and see whether it's something Nagios may be able to detect.

Cheers,
M
 
Old 07-17-2009, 07:22 AM   #12
unSpawn
Moderator
 
Quote:
Originally Posted by helptonewbie View Post
Not to sure what your saying here.. i'm not wanting to filter out logged data, but would want to filter it for the purposes of alerting say to many access attempts, or anything really. Yes i can see it being a for the amount of Regexes that could be required. Not sure there is any other choice... logs need to be read and there's no way i can read through everything... i'd miss something... but on the same stamp... so would anything i set-up that 'regexes' the log data also miss things. Its another trade off thing i suppose.
I'm just saying that filtering at the source ain't good. As far as regexes go, the same applies really: if you filter for only known errors you'll miss stuff, whereas if you filter out the known-good lines, whatever is left should include all the errors. All it takes is some time to keep adjusting things.
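In other words, something in the spirit of this, where known_good.txt is a file of regexes you grow over time for lines you've already vetted:
Code:
# everything that doesn't match a known-good pattern is, by definition,
# worth a human look
grep -E -v -f known_good.txt /var/log/messages > needs_review.txt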


Quote:
Originally Posted by helptonewbie View Post
I don't know alot about database corruption. I assume if i had something like nagios testing the tables for output and comparing the output is retrieved and is the expected output... is that a suffeicent thing.... i need to read more about possible database corruption and see if its something nagios may be able to detect.
In that case you'd best search the 'net for phrases like "detect MySQL database corruption" and "MySQL database recovery" to find things like Managing MySQL, check the MySQL manual for sections like Administrative and Utility Programs, or maybe buy a dead tree like SAMS' MySQL Administrator's Guide (ISBN: 0672326345).
 
Old 07-21-2009, 11:41 AM   #13
helptonewbie
Member
 
Original Poster
Quote:
I'm just saying that filtering at the source ain't good. As far as regexes go the same applies really. If you filter for only known errors you'll miss out on stuff, if you filter out known good lines you should find all errors. All it takes is some time to adjust things.
Yep, totally agree with that, thanks.

Quote:
In that case you best search the 'net for phrases like "detect MySQL database corruption" and "MySQL Database Recovery" to find stuff like Managing MySQL, check the MySQL manual for things like Administrative and Utility Programs or maybe buy a dead tree like SAMS MySQL Administrator's Guide (ISBN: 0672326345)?..
From reading about this in a few places, the only real way to check seems to be with mysqlcheck, or was it myisamchk... (I'm using MyISAM). Anyway, I need to re-read things, as you can tell.
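From what I've read so far it boils down to something like this (the database name is made up, and myisamchk wants the server stopped or the tables flushed and locked first):
Code:
# check (and optionally auto-repair) tables through the running server
mysqlcheck --check --all-databases
mysqlcheck --auto-repair syslog
# or check the MyISAM table files directly, offline
myisamchk /var/lib/mysql/syslog/*.MYI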

I've now got the important log messages being sent to the logging server encrypted with stunnel, but not all logs, due to the extra processing it would otherwise all require.

Thanks for your input... it's been useful!

M
 
Old 07-21-2009, 12:56 PM   #14
unSpawn
Moderator
 
Thanks for your reply; good to see you're making things work. In closing, would it be possible for you to share a (terse?) list of the things w.r.t. logging that you implemented or changed from the defaults? It might help others.
 
Old 08-10-2009, 09:47 AM   #15
rfelsburg
Member
 
Have you looked into psacct, sar, and sa for command history?

This has proven useful to me many times when tracing back user involvement and user security snafus.
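If the package (usually called psacct or acct) is installed, turning it on and pulling a user's command history looks roughly like this; the accounting file location differs a bit per distro and the user name is just an example:
Code:
# start BSD process accounting
touch /var/log/pacct
accton /var/log/pacct
# list every command a given user ran, with flags and CPU time
lastcomm jdoe
# summarise accounting data per user/command
sa -u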

-Rob
 
  

