Linux - Security: This forum is for all security-related questions.
Questions, tips, system compromises, firewalls, etc. are all included here.
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
I would start by disallowing the "root" user from logging in through SSH, and installing fail2ban (or better, only using keys to authenticate) -- that is, if you need to access the desktop through SSH at times.
Making sure all packages are up to date is quite obvious. And try to only use software from "trusted" sources.
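For reference, the SSH part of that advice comes down to two sshd_config directives. This is a sketch assuming a stock OpenSSH server; the drop-in directory path is an assumption that holds on recent Debian/Ubuntu:

```
# /etc/ssh/sshd_config.d/99-hardening.conf (or edit /etc/ssh/sshd_config directly)
PermitRootLogin no
PasswordAuthentication no
```

Reload the SSH daemon afterwards ("systemctl reload ssh" or "sshd", depending on the distro) and verify you can still log in with your key before closing your current session.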
On the desktop, the main attack vector is the browser. You could prevent many attacks by:
1. Disabling Java in your web browser, and only enabling it when you really need it.
2. Disabling JavaScript, or better, installing an add-on such as NoScript to manage site-specific rules.
3. Disabling Adobe Flash, enabling it only on sites you trust (e.g. YouTube).
4. Installing an ad blocker (surprisingly, most payloads tend to come from hacked ad servers).
For physical access attacks (people who could walk up to your computer):
1. You could use an encrypted filesystem to prevent leakage of data if the hard drive is stolen.
2. Disable automounting and autorunning software when a device is connected (do not prompt when a USB stick is inserted, for example).
3. Use a screen locker that locks the screen after a period of inactivity.
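For point 2, under GNOME the automount/autorun behaviour lives in the media-handling settings schema. A dconf keyfile sketch (key names assume GNOME; other desktops have their own settings):

```
[org/gnome/desktop/media-handling]
automount=false
automount-open=false
autorun-never=true
```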
It depends how deep you want to go. Obviously, the more you lock down a system, the less usable it becomes. These are merely some things I personally choose to do when setting up a new machine.
Before my post, my security checklist was a firewall, ClamAV, rkhunter, and disabling startup services that I don't need. I hadn't thought much about the browser end, so I will add those suggestions to my checklist. Thanks.
Consider using this hosts file. It will prevent your computer from accessing hundreds of malicious/invasive websites. It will also prevent the display of many ads.
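To illustrate the mechanism: the file is just ordinary hosts entries pointing unwanted names at an unroutable address. A sketch with placeholder hostnames (ads.example.com is not a real ad host):

```shell
# Write a tiny hosts-style blocklist; real lists have thousands of entries.
cat > /tmp/hosts.block <<'EOF'
0.0.0.0 ads.example.com
0.0.0.0 tracker.example.net
EOF
# Entries in this format get appended to /etc/hosts (which requires root);
# lookups for the listed names then resolve to 0.0.0.0 and go nowhere.
grep -c '^0\.0\.0\.0' /tmp/hosts.block   # prints 2
```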
I was encouraged by some of the gurus here to install tiger and run it. On Debian or Ubuntu, this is as simple as typing "sudo apt-get install tiger"
Generally, improving security means a few things:
1) Use a secure form of authentication for anyone who must connect. I usually disable password authentication and require a key pair instead.
2) Disable any services you don't need which listen for connections.
3) For the ones you do need, try to limit who can connect using configuration settings (e.g., localhost connections only), iptables, or a firewall.
4) Practice safe internet habits when browsing.
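Points 2 and 3 in shell terms (a sketch; port 631/CUPS is just an example service, and the iptables rule is commented out because it needs root to apply):

```shell
# Point 2: see which TCP services are listening for connections
ss -tln
# Point 3: if a service must stay running, drop connections that are not
# from localhost (example rule for port 631; run as root to apply):
#   iptables -A INPUT -p tcp --dport 631 ! -s 127.0.0.1 -j DROP
```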
Quote:
Originally Posted by sneakyimp
Consider using this hosts file (http://winhelp2002.mvps.org/hosts.htm). It will prevent your computer from accessing hundreds of malicious/invasive websites. It will also prevent the display of many ads.
Deities know why people keep promoting it... Using /etc/hosts for redirecting lookups is a crude, past-millennium way of dealing with things. To be more specific, it can't:
- parse Javascript or do path-based filtering,
- update incrementally,
- block ad-tracking cookies,
- block in-page ads residing in a path on the same server you visit,
- block ads from a hostname whose domain name is the same as the server you visit,
- block ads presented through Javascript or Flash,
- block ads by host or path substring match,
- block only web bugs,
- set session-only cookies for a range of sites,
- selectively block popups, refresh tags, and redirects,
- keep images with specific sizes from displaying, or
- block visits to domains based on content.
On top of that you'll have a hosts file filled with sites you might not visit even ten per cent of, no insight into how filters get added, and nobody to vouch for its contents other than this one person.
Put differently: given the facts that 0) users get redirected to malware-delivering sites via completely unrelated sites, 1) often using Javascript (the application layer, not the network layer), and 2) the speed with which criminal organizations set up DNS and malware sites, the value of using said (or any such) hosts file will enhance best practices in a measurably negligible way.
So please consider not promoting that hosts file or the use of /etc/hosts. For effective, usable, and efficient distro- and browser-agnostic filtering you can't beat a filtering proxy.
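As a pointer: Privoxy is one common filtering proxy. A minimal sketch of its configuration, using directives from a stock install (the values are assumptions to adapt, not a complete setup):

```
# /etc/privoxy/config (excerpt)
listen-address  127.0.0.1:8118
actionsfile     default.action   # site/path-based blocking rules
filterfile      default.filter   # content-rewriting filters
```

Point the browser's HTTP proxy at 127.0.0.1:8118 and rules then apply per host, per path, and per page content, which is exactly what a hosts file can't do.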
Other than that the use of a custom hosts file isn't even in the top ten of security measures to take so this kind of distracts from what the OP should do.
Quote:
Originally Posted by unSpawn
Using /etc/hosts for redirecting lookups is a crude, past-millennium way of dealing with things. (..)
Apologies for suggesting it, but I think it's better than nothing at all. It has certainly blocked a lot of unpleasant ads for me. Certain sites have very intrusive ad behaviors (ahem, cough cough), and I have noticed a markedly improved user experience visiting them with a hosts file blocking a lot of common ad networks.
You are entirely correct that this approach won't help you out-maneuver a determined hacker, but it does let you avoid interacting with a simple list of really irritating sites.
Quote:
Originally Posted by unSpawn
Put differently: given the facts that 0) users get redirected to malware-delivering sites via completely unrelated sites, 1) often using Javascript (the application layer, not the network layer), and 2) the speed with which criminal organizations set up DNS and malware sites, the value of using said (or any such) hosts file will enhance best practices in a measurably negligible way.
So please consider not promoting that hosts file or the use of /etc/hosts. For effective, usable and efficient distro and browser-agnostic filtering you can't beat a filtering proxy.
Other than that the use of a custom hosts file isn't even in the top ten of security measures to take so this kind of distracts from what the OP should do.
I will most definitely stop promoting it as you suggest. Perhaps you could recommend a good how-to guide for a filtering proxy? I've googled a bit but am not sure which sites are trustworthy.
Apologies for suggesting it, but it's better than nothing at all I think.
No need to apologize.
Quote:
Originally Posted by sneakyimp
(..) I have myself noticed a markedly improved user experience visiting them with a hosts file blocking a lot of common ad networks. You are entirely correct that this approach won't help you out-maneuver a determined hacker, but it does let you avoid interacting with a simple list of really irritating sites.
What about when using Linux as a desktop only? What would be a minimal checklist for security?
"Desktop Linux" may mean anything ranging from like media centre, library or class room PC to development or research workstation so let's define it as a machine that runs no (publicly) exposed / accessible services, may be exposed to different users and resides behind a router.
While threats to stability and security may seem different, architecture-wise it makes no difference in what role a Linux machine is used, as the basis remains the same: root is the most powerful account and should always be protected. Ergo, basic hardening and auditing remain best practices. If you disagree: are threats like a user being able to bypass the OS using bootable media, a system (wrongly) configured with sudo "ALL=NOPASSWD: ALL", running networked services via virtualization software, or running development-grade software like XAMPP really that different? And does residing behind a router make a difference if a user can trigger UPnP in the router and bypass the rule of not forwarding ports except for already established connections?
Oh, and by the way: security isn't a checklist or a single tool; it should be layered and be a continuous process of auditing and adjusting.
In short:
- your distribution's installation documentation (install only what you need, disable services, backups, etc.),
- your distribution's security documentation (strong passwords, access restrictions, firewall, auditing, etc.),
- any other Linux best practices and checklists from SANS ISC, OWASP, or CIsecurity,
- minimally, tools like GNU Tiger and Logwatch, preferably Samhain (or AIDE) too,
- any sane unprivileged user settings like disabling Java and Flash, browser plugins like NoScript, etc., and don't forget to
- educate users, as that's where most problems will originate :-]
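On Debian/Ubuntu the tool suggestions above map to packages roughly like this (package names are assumptions based on the standard repositories; run as root):

```shell
# Install the audit and log-monitoring tools mentioned above
apt-get install tiger logwatch aide    # or samhain instead of aide
# Run an audit; on Debian the reports end up under /var/log/tiger/
tiger
```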
My auditing tool Lynis might help you with that as well. It does an extensive check of your system and gives you suggestions on what you can improve regarding security.
It's suitable for servers and desktops. unSpawn already gave a lot of good tips, which go hand in hand with automated testing. After all, it's a process, improving one step at a time.
I would start by disallowing "root" user to login through ssh
Just this. (And maybe disallow direct root logins completely, requiring sudo; on Ubuntu/Debian-based systems this is the default.)
For the rest, you can use Java and visit sites with JavaScript, Flash, and god knows what else.
Just do one thing: keep your software up to date.
See a notification that updates are available? Do the updates.
Do this and it will already take a lot more than a script kiddie to invade your computer.
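On Debian/Ubuntu the keep-it-updated advice can even be automated with the unattended-upgrades package; the stock switch looks like this (a sketch; the file path and option names follow the standard apt configuration):

```
// /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```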
Quote:
Originally Posted by unSpawn
root is the most powerful account and should be protected always
Then the question is: what is the second most powerful account? Root needs protection for the sake of system integrity, etc. Your main user needs protection because of the various secrets it holds in its files. It's also generally a more privileged user than needed for many things (e.g. mine has groups... cdrom sudo audio video plugdev netdev wheel). So I like to do general browsing, movie watching, music... basically anything consuming input from untrusted sources, using a third account. It's a little awkward, but not so bad (in fact so easy I now have a fourth account) if you can get used to launching programs from xterms and have a window manager that's good at grouping windows into different sets.
I have this alias to start an xterm for launching unprivileged programs:
alias unpriv='xhost +si:localuser:unprivileged_user; su -l unprivileged_user -c xterm'
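For that alias to work, the throwaway account has to exist first. A setup sketch ("unprivileged_user" matches the name used in the alias; run as root):

```shell
# Create the account with its own home and deliberately no extra groups,
# so a compromised browser inherits as little privilege as possible.
useradd -m -s /bin/bash unprivileged_user
passwd unprivileged_user
```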
If you're really good, maybe you can figure out a solution with xauth and the X Security extension, or whatever that other thing that followed it was. The hole in the approach above (aside from kernel bugs giving privilege escalation, and accidents of configuration) is that X clients are all completely visible to each other. So I feel safer that a Firefox exploit might be contained within my unprivileged user's environment, but it's maybe false security, since it might not be so hard to get code running that iterates over the X clients looking for goodies or injecting keystrokes.
Some people I've talked to think my approach awkward and suggest virtualization for unprivileged browsing (to me that's much more awkward, but whatever). In that case, can you share the X clipboard securely? You're going to want the clipboard. (I've been through trying to convince myself I don't need it when attempting to use the X Security extension... nah, you'll want it.) So you'll probably connect your virtual machine to the same X server that your other clients are on and hit the same issue. I think almost any solution will hit up against X until somebody implements something like the X Security extension in a useful way, or X gets replaced by Wayland (assuming they do the clipboard etc. relatively securely).
Still, maybe most exploits don't go the extra mile to get other X clients. Running this way isn't that common, so it ought to still buy you something.