Linux - Security
This forum is for all security related questions. Questions, tips, system compromises, firewalls, etc. are all included here.
View Poll Results: Should super patches be allowed?
I came across an interesting idea from Bugtraq: the super patch. This is a single patch that resolves a large number of problems at once. I have not seen much of this in practice in the OS world, but I wonder if there is a point here:
Quote:
If each normal patch has a probability P of causing problems, then an N-fold patch has probability (1 - P)^N of NOT causing a problem. Thus the probability is 1 - (1 - P)^N that the N-way patch will have an issue.
For real-world numbers, if P = .1 (a 10% chance the patch may be problematic) and N is 10, then the patch has a 65% chance of being a problem. Even if P is .01, there is still a nearly 10% chance of problems from a 10-way superpatch.
------------------------
Bugtraq post by Nicholas C. Weaver
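The arithmetic in the quote checks out and is easy to verify directly; a minimal sketch, using the values from the post:

```python
def problem_probability(p, n):
    """Probability that at least one of n independent patches,
    each problematic with probability p, causes a problem."""
    return 1 - (1 - p) ** n

print(round(problem_probability(0.10, 10), 3))  # 0.651 -- the ~65% figure
print(round(problem_probability(0.01, 10), 3))  # 0.096 -- the "nearly 10%" figure
```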
Large patches can cause more problems than they fix?! What's the best way: a war of attrition (small patches that don't break things), or all-out nuclear patches that aim to fix everything at once and end up causing large amounts of collateral damage?
It sucks when something breaks. I would rather have small patches in large quantities, especially if they are held in a repo [apt/yum/urpmi/etc.]. This way, I don't have to patch software I don't have [a la Windows SP]. It's also easier on bandwidth usage.
I think the logic in that quote is a little misleading. A "super-patch" is basically just a bunch of individual patches rolled into one; each patch is still independent of the others. A good analogy would be taking a handful of coins and tossing them into the air all at the same time, versus taking the same handful of coins and flipping them one by one. You can quickly see that the chances of getting all heads are the same either way. Just because you are installing N patches at the same time doesn't somehow change the likelihood that a given patch is poorly written and breaks something. Maybe I'm just looking at this differently?
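The coin analogy can be checked with a quick simulation sketch (the 10% per-patch failure rate is taken from the quote above): whether the patches are bundled or applied separately, the chance that at least one is bad comes out the same, matching the closed-form value.

```python
import random

def at_least_one_failure(n_patches, p_fail, trials=100_000):
    """Estimate the chance that a set of n independent patches
    contains at least one bad one -- bundling does not change
    the independence of the individual outcomes."""
    bad = sum(
        any(random.random() < p_fail for _ in range(n_patches))
        for _ in range(trials)
    )
    return bad / trials

random.seed(42)
est = at_least_one_failure(10, 0.10)
exact = 1 - 0.9 ** 10  # ~0.651
print(est, exact)      # the estimate lands close to the exact value
```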
Yeah, service packs are what I have in mind, and I do believe they are a bad idea. I agree that they are just a series of small patches rolled into one, but if I install them one by one, as I have need (due to my software setup), then it is easier to trace what broke and thus report the problem more accurately, which helps us all.
I like the coin analogy. Now say 30% of them land on tails (i.e. break my system) when I wanted them on heads: how can I trace which ones broke? All the coins are in the same package, so I have to keep throwing them and picking them up repeatedly until I get all heads!
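That tracing problem is the strongest argument for small patches: applied one at a time with a check after each, the culprit identifies itself. A hypothetical sketch (the patch names and the `system_ok` check are invented for illustration):

```python
def find_bad_patch(patches, system_ok):
    """Apply patches one at a time and test after each;
    return the first patch whose application breaks the check,
    or None if everything stays healthy."""
    applied = []
    for patch in patches:
        applied.append(patch)
        if not system_ok(applied):
            return patch
    return None

# Hypothetical example: pretend patch "p3" is the one that breaks things.
broken = {"p3"}
culprit = find_bad_patch(
    ["p1", "p2", "p3", "p4"],
    system_ok=lambda applied: not (set(applied) & broken),
)
print(culprit)  # p3
```

With a monolithic service pack there is no equivalent step-by-step check; the whole bundle either works or it doesn't.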
I think all software providers should steer clear of the SP model. It can create a huge mess.