Linux - General: This Linux forum is for general Linux questions and discussion. If it is Linux-related and doesn't seem to fit in any other forum, this is the place.
Let's say you're developing software for some special purpose, and deploying it on your own machines at a remote location (perhaps behind a firewall). You're using [GNU-slash-]Linux as your operating system. You have no physical access to the machines, but you do have remote terminal access such as SSH.
What are some best practices for managing and maintaining such machines?
In my experience, there are a lot of things that can make it pretty painful:
- upgrades on popular distributions like Ubuntu can sometimes have unpredictable consequences
- not upgrading your system leaves potential security holes open, and risks dependency issues when you upgrade the software you're developing
- a lot of stuff happens "under the hood" which implicitly requires unrestricted network access (e.g. NTP time synchronization)
- security breaches and/or system issues can eventually require a reinstall, and therefore physical access
- etc. etc.
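To make a couple of the pain points above concrete: on a Debian/Ubuntu box you can at least freeze the packages you care about and switch off the "under the hood" background activity yourself. A rough sketch (the package and unit names are just examples, adjust to your stack; the DRY_RUN wrapper prints commands instead of executing them, which is prudent on a machine you can't physically reach):

```shell
# Print commands instead of running them while DRY_RUN=1, so changes
# can be reviewed before they touch a remote host.
DRY_RUN=1
run() { if [ "${DRY_RUN:-0}" = 1 ]; then echo "would run: $*"; else "$@"; fi; }

# Freeze the kernel and a critical library so 'apt upgrade' cannot surprise you.
# (Example package names -- substitute your own.)
run apt-mark hold linux-image-generic openssl

# Silence background activity: unattended upgrades and NTP time stepping.
run systemctl disable --now unattended-upgrades.service
run systemctl disable --now systemd-timesyncd.service
```

Held packages stay at their current version until you explicitly `apt-mark unhold` them, so routine upgrades of everything else can still proceed.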
I was thinking that maybe a do-it-yourself distribution like Slackware could make the system much more stable, at the expense of engineers comfortable with the system no longer being a dime a dozen...
So I'm looking for other people's experience with this.
Whatever's on your mind related to the topic
Last edited by bheadmaster; 08-14-2020 at 06:21 AM.
Reason: Stallman-correctness
There are no "best" practices; this always depends on the goal and the infrastructure. For example, we have a stable development environment: there are no upgrades, no "under the hood" happenings, and in general everything [every change] is controlled.
Well, I'd say "stable development environment" and "every change is controlled" count as best practices?
I know it might sound simple to you, but how do you actually maintain such a controlled environment? How do you make sure that nothing happens on the remote machine that is not under your control? And so on...
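One low-tech way to approach that question: keep a checksum manifest of the paths you own and re-verify it on a schedule, so any change you did not make at least gets noticed. A minimal sketch (the manifest path is illustrative; a real deployment would also sign the manifest and keep a copy off-host so an intruder can't just regenerate it):

```shell
# Record a sha256 manifest of every file under a directory, then verify
# it later -- any edited, replaced, or corrupted file shows up as FAILED.
manifest=/var/tmp/manifest.sha256   # illustrative location

make_manifest() {
    # Sorted so manifests are diff-able between runs.
    find "$1" -type f -print0 | xargs -0 sha256sum | sort -k2 > "$manifest"
}

check_manifest() {
    # Nonzero exit status if anything no longer matches the manifest.
    sha256sum --quiet -c "$manifest"
}
```

Run `check_manifest` from cron or a systemd timer and alert on a nonzero exit; it won't prevent changes, but it turns "nothing I don't know about" from a hope into something you actually verify.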
It is not only me but a group who maintain it, and not only one host but many.
But in different situations there can be different requirements and solutions.
I have helped manage such an environment, although with some differences. The OS was a mix of RHEL and CentOS; we had full control of the network, host hardware, and host software, with remote access via host control modules that allowed us to reboot, watch the boot, or remotely load an OS onto new iron easily.
We had two levels of software: packages provided in the OS repository, and our own home-grown application/server suites. Our own software was distributed from a unique repository system completely under our control. It was distribution- and OS-agnostic because we had clients running their own servers on RHEL, AIX, HP-UX, Solaris, and Windows (where we could not dissuade them; yeah, it WAS sad). In this way we controlled the dependency lists and prerequisites, and could move forward or roll back under our full control and at will.
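One common building block behind that kind of "move forward or roll back at will" capability (a generic sketch, not the actual system described above) is versioned release directories behind a "current" symlink: installing a release never touches the running one, and a rollback is just repointing the link. Paths here are illustrative:

```shell
# Each release lives in its own directory; "current" is a symlink that
# the service uses as its install path. Switching it is a single rename,
# so there is no half-upgraded state to get stuck in.
activate() {   # usage: activate <releases-dir> <version>
    ln -sfn "$1/$2" "$1/current"
}

# Hypothetical layout:
#   /opt/myapp/1.0/ ...
#   /opt/myapp/1.1/ ...
#   /opt/myapp/current -> /opt/myapp/1.1
# Roll back with: activate /opt/myapp 1.0
```

The `-n` flag matters: without it, `ln -sf` into an existing symlink-to-directory would create the new link *inside* the old release instead of replacing the link.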
As a network/systems admin I am a bit OCD: if I hold the responsibility then give me the damn keys to the kingdom or go home. I do not need to control everything (and do not WANT to) but I need to SEE everything and be able to control everything I need to do the job.
It has been two years since I had to manage more than a couple of machines remotely in anything like that intense or critical a situation, but at that time I would not have considered Ubuntu suitable. My options were stable, long-term distributions well supported for corporate operations from the SUSE and RHEL families, with excellent support from the hardware vendor.
If you do not and cannot control your infrastructure, then you can never consider your machines secure. For one thing, physical access trumps every software security measure known to man.
For an application development and distribution situation it is hard to beat configuring your own packaging and distribution to ensure that every prerequisite and requirement is met (or will be met during installation/update) BEFORE any software or configuration changes are deployed.
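As one illustration of that idea, RPM-style packaging lets you declare prerequisites so the package manager refuses to proceed until they are met, and a %pre scriptlet can abort the transaction before any files are touched. Every name and version in this fragment is hypothetical:

```
# Hypothetical spec-file fragment for an in-house package.
Name:     myapp
Version:  1.2.3
Release:  1
Summary:  Example in-house application package
License:  Proprietary

# Enforced by the package manager BEFORE installation begins.
Requires: openssl >= 1.1
Requires: systemd

%pre
# Runs before any files are laid down; a nonzero exit aborts the install.
id myapp >/dev/null 2>&1 || { echo "user 'myapp' missing" >&2; exit 1; }
```

Debian packaging offers the same guarantees through the control file's Depends field and the preinst maintainer script.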
One key point: the term "best practices" is a phrase coined by vendors who want to sell you on THEIR solution. Usually to make a $! You do what works, will work the longest without breaking, will satisfy the business and security requirements, and can be controlled and supported without pain going forward. If you have a team, pick their brains and get some level of agreement on requirements, standards, and architecture before you configure or code anything you might have to live with for years. Rushing is tempting, but can lead to disaster.
Finally, do not lean too much on the advice of people who do NOT understand every fine detail of your situation. That includes me. We at LQ can give you lots of good advice, advice that might lead you to the WRONG solution because of things we do not and cannot know. We do not have to live with your solution, but YOU do!