Linux - Software
This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
Does anyone know of any cheaper alternatives to splunk? For what it is, it seems very expensive and there's no way that I could convince management to splash out for it.
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,197
Rep:
Well, some might argue that it's not in the same class, however, it suits my idea for what I want in this type of thing. And, it's open source, which fits your directive for cheaper. It's called SEC, or Simple Event Correlator. Simple, powerful, fast, pure perl, and not a lot of other stuff added on. I confess that I haven't actually implemented this yet, but I did spend a lot of time searching. My idea is to implement syslog-ng with a syslog server and have SEC running on the syslog server.
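To give a feel for how lightweight SEC is, a rule is just a handful of key=value lines in its config file. This is only a sketch; the pattern, threshold, and mail command are hypothetical and you'd tune them for your own logs:

```
# Hypothetical SEC rule: alert on repeated SSH auth failures.
# Fires when 5 matching lines arrive within a 60-second window.
type=SingleWithThreshold
ptype=RegExp
pattern=sshd\[\d+\]: Failed password for (\S+)
desc=Repeated SSH login failures for user $1
action=pipe '%s' /bin/mail -s 'SSH alert' root
window=60
thresh=5
```

You'd typically run it against the syslog server's log files with something like `sec -conf=sec.conf -input=/var/log/messages`.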
For similar reasons, I am in the middle of configuring mon (try googling that, I don't remember how I finally stumbled on it). I want to keep my servers as simple as possible. If a monitor program gives me a list of a gazillion things it needs installed in order to function, then I don't want it.
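For anyone else digging into mon, its config is similarly spartan: you define hostgroups and attach monitor/alert scripts to them. A rough sketch (hostnames are placeholders; fping.monitor and mail.alert ship with mon):

```
# Hypothetical mon.cf fragment: ping two hosts every 5 minutes,
# mail root on failure.
hostgroup servers  web1 web2

watch servers
    service ping
        interval 5m
        monitor fping.monitor
        period wd {Sun-Sat}
            alert mail.alert root
            alertevery 1h
```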
I don't intend to get into graphics for either of these. However, you can apply the same philosophy to that, using just RRDTool and a dirt simple (not full function) perl based web server that kicks off from inetd and has zero footprint otherwise.
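The inetd part of that zero-footprint setup would amount to a single line in /etc/inetd.conf, which spawns the script per connection and leaves nothing resident otherwise. The script path here is a made-up placeholder for whichever tiny perl server you end up using:

```
# Hypothetical /etc/inetd.conf entry: inetd listens on port 80 and
# hands each connection to the perl script; no daemon runs when idle.
http  stream  tcp  nowait  nobody  /usr/local/bin/tinyhttpd.pl  tinyhttpd.pl
```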
I had a hard time finding the simple perl web server, and I don't remember where it is or what its name was now. It was almost as hard to find as mon or sec. There were 2 or 3 of them, but only one suited me. If I decide to go this route, I'll have to find it again.
That was the point of using syslog-ng with a log server together with SEC. Syslog-ng is pretty much the standard for this sort of thing.
SEC is really simple, but powerful. You could put it on multiple servers, but the advantages of having a log server are significant enough on their own. You have one place to look at logs (few sysadmins spend enough time doing that), still have log history if one of the servers is compromised, etc. Typically, this would be configured so that local system logs are maintained in addition to the log server. This means that if you are working on a particular server, you have logs locally. If you are trying to correlate events across multiple servers, you have one place to look. And, if you have a compromised system, in which the hacker /dev/nulled the logging, you have the log history up to that point on the log server, and you can analyze what happened.
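As a rough illustration of that setup, each client forwards its syslog stream to the central box while still logging locally, and the server files everything per host. A minimal syslog-ng sketch (hostname, port, and paths are assumptions):

```
# --- client side: forward everything to the log server,
# --- in addition to whatever local logging is already configured
destination d_loghost { udp("loghost.example.com" port(514)); };
log { source(s_local); destination(d_loghost); };

# --- server side: accept remote syslog and split it out per host
source s_net { udp(ip(0.0.0.0) port(514)); };
destination d_hosts { file("/var/log/remote/$HOST/$FACILITY.log"); };
log { source(s_net); destination(d_hosts); };
```

SEC would then watch the files under /var/log/remote/ on the server, giving you the one place to correlate events across machines.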
pmcgovern, the forum rules do not permit advertising. Please visit http://www.linuxquestions.org/advertising/ for more information on advertising. Feel free to contact the forum admin if you have any questions about this policy.
Ahhh yes ....
I had wondered about where the (advertising) line was to be drawn.
Still, I was surprised when the OP mentioned splunk was too expensive. I had always considered it as free from the truckloads of ads I had seen - on sf.net or freshmeat.net maybe.
Never bothered downloading it, so never knew the true position.
My apologies if my post appeared to be advertising. Not my intention.
Noodles25 has been looking for something cheaper than Splunk.
My only point was that Splunk is free (up to 500 megabytes of indexable data a day). It's hard to be cheaper than free. Noodles may not know that there is a free version.
ccosk, Thanks. As many people are finding, 500MB of data is nothing when you start talking about log aggregation across many servers, so the free version of Splunk will not work. Like many others, I am searching around for Splunk alternatives. I like Splunk, I think it is a great product, but unless you are a fairly decent-sized enterprise, budgets are not flexible enough to cover this expense.
I am taking a look into f-deets now to see if that will work for our needs.
If you would like to help us out and provide some feedback on what you need, we can give you access to our beta. Take a look at http://www.cloudpelican.com/ and signup to stay in touch.
Resurrecting old (inactive) threads is generally frowned upon. Especially if it could be construed you have merely signed up here to do advertising.
Personally, I would be happier to see that web page have (much) more data on what you are proposing (along the lines you have posted here maybe) before surrendering my email address to be potentially spammed.
Something to consider maybe - some of us have become more suspicious over the years.