LinuxQuestions.org
Old 07-04-2014, 03:05 PM   #1
${i}
LQ Newbie
 
Registered: Jul 2014
Posts: 22

Rep: Reputation: 9
How to do a basic audit of a Linux desktop?


Hello LQ members,

I'm not an expert on security, but I know that if you use Linux as a server, especially a public one, you need to be proactive about security.

What about when using Linux as a desktop only? What would be a minimal security checklist? Thanks
 
Old 07-04-2014, 04:00 PM   #2
coralfang
Member
 
Registered: Nov 2010
Location: Bristol, UK
Distribution: Slackware, FreeBSD
Posts: 762
Blog Entries: 3

Rep: Reputation: 246
I would start by disallowing the "root" user from logging in through ssh, and installing fail2ban (or better, only using keys to authenticate) -- that's if you have the need to access the desktop through ssh at times.
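For the SSH side, the relevant sshd_config directives look like the ones below. This is only a sketch: it writes them to a scratch file so it is safe to run as-is; on a real system you would put them in /etc/ssh/sshd_config and reload sshd afterwards.

```shell
# The directives that forbid root logins and password authentication (keys only).
# Demonstrated on a scratch file; apply to /etc/ssh/sshd_config for real,
# then reload sshd (e.g. sudo systemctl reload sshd).
conf=$(mktemp)
cat >"$conf" <<'EOF'
PermitRootLogin no
PasswordAuthentication no
PubkeyAuthentication yes
EOF
grep 'Authentication' "$conf"
rm -f "$conf"
```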

Making sure all packages are up to date is quite obvious. And try to only use software from "trusted" sources.

On the desktop, the main attack vector is the browser; you could prevent many attacks by:
1. Disable Java in your web browser, and only enable it when you really need it.
2. Disable Javascript, or better, install an add-on such as NoScript to manage site-specific rules.
3. Disable Adobe Flash; only enable it on sites you trust (e.g. YouTube).
4. Install an adblocker (surprisingly, most payloads tend to come from hacked ad servers).

For physical-access attacks (people who could walk up to your computer):
1. You could use an encrypted filesystem to prevent leakage of data if the hard drive is stolen.
2. Disable automounting or autorunning of software when a device is connected (do not prompt when a USB stick is inserted, for example).
3. Use a screen locker that locks the screen after a period of inactivity.
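Point 2 can be done with a couple of settings on common desktops; a GNOME-specific sketch (the schema below is GNOME's, so this assumes a GNOME session; other desktops keep the equivalent elsewhere):

```shell
# GNOME sketch for point 2: turn off automounting/autorun of inserted media
# (keys from the org.gnome.desktop.media-handling schema).
gsettings set org.gnome.desktop.media-handling automount false
gsettings set org.gnome.desktop.media-handling automount-open false
gsettings set org.gnome.desktop.media-handling autorun-never true
```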

It depends how deep you want to go. Obviously, the more you lock down a system, the less usable it becomes. These are merely some things I personally choose to do when setting up a new machine.
 
1 members found this post helpful.
Old 07-04-2014, 04:59 PM   #3
${i}
LQ Newbie
 
Registered: Jul 2014
Posts: 22

Original Poster
Rep: Reputation: 9
Great tips, coralfang.

Before my post, my security checklist was a firewall, ClamAV, rkhunter, and disabling startup services I don't need. I hadn't thought much about the browser end, so I will add those suggestions to my checklist. Thanks

+1
 
Old 07-04-2014, 09:16 PM   #4
sneakyimp
Senior Member
 
Registered: Dec 2004
Posts: 1,055

Rep: Reputation: 78
Consider using this hosts file (http://winhelp2002.mvps.org/hosts.htm). It will prevent your computer from accessing hundreds of malicious/invasive websites. It will also prevent the display of many ads.

I was encouraged by some of the gurus here to install tiger and run it. On Debian or Ubuntu, this is as simple as typing "sudo apt-get install tiger"

Theoretically, improving security generally means a few things:
1) Use a secure form of authentication for anyone who must connect. I usually disable password authentication and require a key pair instead.
2) Disable any services you don't need that listen for connections.
3) For the ones you do need, try to limit who can connect using configuration settings (e.g., localhost-only connections), iptables, or a firewall.
4) Practice safe internet habits when browsing.
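A quick way to act on points 2 and 3 is to look at what is actually listening and then restrict it; a sketch (the port below is purely illustrative):

```shell
# Point 2: list listening TCP sockets; add -p (as root) to see owning processes.
ss -tln
# Point 3, illustrative only and requires root: drop non-loopback traffic
# to port 631 (CUPS). Adapt the port and policy to your own services.
# sudo iptables -A INPUT -p tcp --dport 631 ! -i lo -j DROP
```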
 
Old 07-05-2014, 04:56 AM   #5
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,409
Blog Entries: 55

Rep: Reputation: 3582
Quote:
Originally Posted by sneakyimp View Post
Consider using this hosts file (http://winhelp2002.mvps.org/hosts.htm). It will prevent your computer from accessing hundreds of malicious/invasive websites. It will also prevent the display of many ads.
Deities know why people keep promoting it... Using /etc/hosts for redirecting lookups is a crude, past-millennium way of dealing with things. To be more specific, it can't parse Javascript, do path-based filtering, update incrementally, block ad-tracking cookies, block in-page ads residing in a path on the same server you visit, block ads from a hostname whose domain name is the same as the server you visit, block ads presented through Javascript or Flash, block ads by host or path substring match, block only web bugs, set session-only cookies for a range of sites, selectively block popups, refresh tags and redirects, keep images with specific sizes from displaying, or block visited domains based on content. On top of that, you'll have a hosts file filled with sites you might not even visit ten per cent of, no insight into how filters get added, and nobody to vouch for its contents other than this one person.

Put differently: given the facts that 0) users get redirected to malware-delivering sites via completely unrelated sites, 1) this often happens using Javascript (at the application rather than the network layer), and 2) criminal organizations set up DNS and malware sites with great speed, using said (or any such) hosts file will enhance best practices in only a measurably negligible way.

So please consider not promoting that hosts file or the use of /etc/hosts. For effective, usable and efficient distro and browser-agnostic filtering you can't beat a filtering proxy.


Other than that, the use of a custom hosts file isn't even in the top ten of security measures to take, so this kind of distracts from what the OP should do.
 
1 members found this post helpful.
Old 07-05-2014, 07:09 PM   #6
sneakyimp
Senior Member
 
Registered: Dec 2004
Posts: 1,055

Rep: Reputation: 78
Quote:
Originally Posted by unSpawn View Post
Deities know why people keep promoting it... Using /etc/hosts for redirecting lookups is a crude, past-millennium way of dealing with things. To be more specific, it can't parse Javascript, do path-based filtering, update incrementally, block ad-tracking cookies, block in-page ads residing in a path on the same server you visit, block ads from a hostname whose domain name is the same as the server you visit, block ads presented through Javascript or Flash, block ads by host or path substring match, block only web bugs, set session-only cookies for a range of sites, selectively block popups, refresh tags and redirects, keep images with specific sizes from displaying, or block visited domains based on content. On top of that, you'll have a hosts file filled with sites you might not even visit ten per cent of, no insight into how filters get added, and nobody to vouch for its contents other than this one person.
Apologies for suggesting it, but it's better than nothing at all, I think. It certainly has blocked a lot of unpleasant ads for me. Certain sites have very intrusive ad behaviors (ahem, cough cough) and I have myself noticed a markedly improved user experience visiting them with a hosts file blocking a lot of common ad networks.

You are entirely correct that this approach won't help you out-maneuver a determined hacker, but it does let you avoid interacting with a simple list of really irritating sites.

Quote:
Originally Posted by unSpawn View Post
Put differently: given the facts that 0) users get redirected to malware-delivering sites via completely unrelated sites, 1) this often happens using Javascript (at the application rather than the network layer), and 2) criminal organizations set up DNS and malware sites with great speed, using said (or any such) hosts file will enhance best practices in only a measurably negligible way.

So please consider not promoting that hosts file or the use of /etc/hosts. For effective, usable and efficient distro and browser-agnostic filtering you can't beat a filtering proxy.


Other than that, the use of a custom hosts file isn't even in the top ten of security measures to take, so this kind of distracts from what the OP should do.

I will most definitely stop promoting it as you suggest. Perhaps you could recommend a good how-to guide for a filtering proxy? I've googled a bit but am not sure which sites are trustworthy.
 
Old 07-06-2014, 03:47 AM   #7
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,409
Blog Entries: 55

Rep: Reputation: 3582
Quote:
Originally Posted by sneakyimp View Post
Apologies for suggesting it, but it's better than nothing at all I think.
No need to apologize.


Quote:
Originally Posted by sneakyimp View Post
(..) I have myself noticed a markedly improved user experience visiting them with a hosts file blocking a lot of common ad networks. You are entirely correct that this approach won't help you out-maneuver a determined hacker, but it does let you avoid interacting with a simple list of really irritating sites.
No, this isn't about determined hackers. Check the points I mention here: https://www.linuxquestions.org/quest...9/#post3585760 and see if /etc/hosts allows for that. (No it doesn't.)


Quote:
Originally Posted by sneakyimp View Post
Perhaps you could recommend a good how-to guide for a filtering proxy?
In addition to a sane browser configuration and, minimally, NoScript, please see http://www.privoxy.org/user-manual/index.html on the privoxy.org web site.
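For reference, Privoxy's documented default listen address is 127.0.0.1:8118; a minimal sketch of pointing proxy-aware command-line tools at a running instance:

```shell
# Assumes a Privoxy instance on its default listen address (127.0.0.1:8118);
# browsers are configured through their own proxy settings instead.
export http_proxy=http://127.0.0.1:8118
export https_proxy=http://127.0.0.1:8118
echo "$http_proxy"
```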


Now let's get back to the topic at hand...
 
Old 07-06-2014, 06:39 AM   #8
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,409
Blog Entries: 55

Rep: Reputation: 3582
Quote:
Originally Posted by ${i} View Post
What about when using linux as a desktop only, what would be minimal checklist for security?
"Desktop Linux" may mean anything from a media centre, library, or classroom PC to a development or research workstation, so let's define it as a machine that runs no (publicly) exposed / accessible services, may be exposed to different users, and resides behind a router.

While threats to stability and security may seem different, architecture-wise it makes no difference in what role a Linux machine is used, as the basis remains the same: root is the most powerful account and should always be protected. Ergo, basic hardening and auditing remain best practices. If you disagree: are threats like a user being able to bypass the OS using bootable media, a system (wrongly) configured with sudo "ALL=NOPASSWD: ALL", running networked services via virtualization software, or running development-grade software like XAMPP really that different? And does residing behind a router make a difference if a user can trigger UPnP in the router and bypass the policy of not forwarding ports except for already established connections? Oh, and by the way: security isn't a checklist or a single tool; it should be layered and be a continuous process of auditing and adjusting.

In short:
- your distribution's installation documentation (install only what you need, disabling services, backups, etc.),
- your distribution's security documentation (strong passwords, access restrictions, firewall, auditing, etc.),
- any other Linux best practices and checklists from SANS ISC, OWASP, or CISecurity,
- minimally, tools like GNU Tiger and Logwatch, and preferably Samhain (or AIDE) too,
- any sane unprivileged-user settings like disabling Java and Flash, browser plugins like NoScript, etc., and don't forget to
- educate users, as that's where most problems will originate :-]
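None of this replaces the tools above, but a couple of quick unprivileged spot checks give a fast first look (a sketch; adjust the paths to taste):

```shell
# Accounts with UID 0: on a healthy system this should list only "root".
awk -F: '$3 == 0 {print $1}' /etc/passwd
# World-writable files lurking near your home directory (ideally none).
find "$HOME" -maxdepth 2 -type f -perm -o+w 2>/dev/null
```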
 
3 members found this post helpful.
Old 07-06-2014, 02:32 PM   #9
${i}
LQ Newbie
 
Registered: Jul 2014
Posts: 22

Original Poster
Rep: Reputation: 9
Thanks to all here for the suggestions.
 
Old 07-30-2014, 10:25 AM   #10
mboelen
LQ Newbie
 
Registered: Nov 2013
Location: The Netherlands
Distribution: Several ones for testing purposes
Posts: 15

Rep: Reputation: Disabled
My auditing tool Lynis might help you with that as well. It does an extensive check of your system and gives you suggestions on what you can improve regarding security.
It's suitable for servers and desktops. unSpawn already gave a lot of good tips, which go hand in hand with automated testing. After all, it's a process, improving one step at a time.

Happy hardening!
 
1 members found this post helpful.
Old 07-30-2014, 10:35 AM   #11
shardik
LQ Newbie
 
Registered: Jul 2014
Posts: 6

Rep: Reputation: Disabled
Quote:
Originally Posted by coralfang View Post
I would start by disallowing "root" user to login through ssh
Just this (and maybe disallow direct root logins completely in favour of sudo; on Ubuntu/Debian-based systems this is the default).

For the rest, you can use Java and visit sites with Javascript, Flash, and god knows what else.
Just do one thing: keep your software up to date.
See a notification that updates are available? Do the updates.

Do this and it will already take a lot more than a script kiddie to invade your computer.
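On Debian/Ubuntu-based systems (an assumption; other distros have their own mechanisms) keeping up to date can be automated with unattended-upgrades. A sketch of the apt configuration fragment it reads, conventionally saved as /etc/apt/apt.conf.d/20auto-upgrades:

```shell
# Prints the two-line apt fragment that enables daily package-list updates
# and unattended upgrades; redirect it into the conffile on a real system.
cat <<'EOF'
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
EOF
```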
 
Old 08-20-2014, 03:19 PM   #12
thirdm
Member
 
Registered: May 2013
Location: Massachusetts
Distribution: Slackware, OpenBSD
Posts: 154

Rep: Reputation: Disabled
Quote:
Originally Posted by unSpawn View Post
root is the most powerful account and should be protected always
Then the question is: what is the 2nd most powerful account? Root needs protection for the sake of system integrity, etc. Your main user needs protection because of the various secrets it holds in its files. It's also generally a more privileged user than needed for many things (e.g. mine is in the groups cdrom, sudo, audio, video, plugdev, netdev, wheel). So I like to do general browsing, movie watching, music... basically anything consuming input from untrusted sources... using a 3rd account. It's a little awkward, but not so bad (in fact so easy I now have a 4th account) if you can get used to launching programs from xterms and have a window manager that's good at grouping windows into different sets.

I have this alias to start an xterm for launching unprivileged programs:

alias unpriv='xhost +si:localuser:unprivileged_user; su -l unprivileged_user -c xterm'

If you're really good, maybe you can figure out a solution with xauth and the X security extension (or whatever that other thing that followed it was). The hole in the approach above (aside from kernel bugs giving privilege escalation, and accidents of configuration) is that X clients are all completely visible to each other. So I feel safer that a firefox exploit might be contained within my unprivileged user's environment, but it's maybe false security, since it might not be so hard to get code running that iterates over the X clients looking for goodies or injecting keystrokes.

Some people I've talked to think my approach awkward and suggest virtualization for unprivileged browsing (to me that's much more awkward, but whatever). In that case, can you share the X clipboard securely? You're going to want the clipboard. (I've been through trying to convince myself I don't need it when attempting to use the X security extension... nah, you'll want it.) So you'll probably connect your virtual machine to the same X server as your other clients and hit the same issue. I think almost any solution will hit up against X until somebody does something like the X security extension in a useful way, or X gets replaced by Wayland (assuming they do the clipboard etc. relatively securely).

Still, maybe most exploits don't go the extra mile to get other X clients. Running this way isn't that common, so it ought to still buy you something.
 
1 members found this post helpful.
  

