[Slackware security] Potential backdoor risk in Rust
If you do not trust a rust compiler that is written in rust, you can go back all the way when rust was still written in ocaml.
That's a bad argument. No one is doing that. It would be great to do that, but no one is. Because you said "you can", I infer that Slackware isn't doing that.
You do realize that Slackware is distributed as prebuilt binaries, right?
If Patrick wants to do it right, he should:
1. compile Rust going all the way back to the original OCaml bootstrap (as someone suggested);
2. aim for reproducible builds, like other distros have.
Many years ago, when I was a fledgling n00b, I was horrified to realize Slackware ran 'fortune'
after I logged in. How presumptuous! I didn't ask it to do that!
I switched to Debian for a while, until I realized the error
was in my thinking, and not in Slackware's way of being.
Why is it that Slackware seems to attract so many complainers? Like moths to a flame?
Let your callow newbiness burn off, Jun 2020 resolver. Purify your mind with knowledge of how
things actually work! Embrace the Slack, or be off! While many here will entertain your crying,
as they do so many other whiners, you will not be better for the fight.
You have heard from developers of this OS, in this very thread, and yet you persist in your folly.
Alas, in the words of William Blake: "If the fool would persist in his folly he would become wise." ...forget everything I said. Carry on!
Quote:
That's a bad argument. No one is doing that. It would be great to do that, but no one is. Because you said "you can", I infer that Slackware isn't doing that.
Again, Slackware is not a from-source distribution. Packages get rebuilt using the binaries presently available. That means there was an original Rust compiler, written in OCaml, that was compiled into a package. That package got updated several times, and each time the new Rust compiler was built using binaries that had been compiled on Slackware earlier. There's a chain of trust there which you refuse to see, or maybe are simply unable to grasp.
I pity you, spending all that time chasing shadows. Good bye.
No, it doesn't. Did you never take a course in logic?
Here's a hint:
Quote:
Originally Posted by dugan
Anyway, the initial premise of your FUD, that the process for building Rust on Slackware necessarily involves downloading upstream compiler binaries, has already been demonstrated to be false.
Yes I did.
Now, would you like to use logical postulates and axioms to deconstruct the point you were responding to? Or is your learned analysis limited to "no it doesn't"?
I mean that's not the only massive problem with your logic that's been pointed out. Your response to that has been a lot of goalpost-moving.
And anyway, since this point has not been rebutted ("no it doesn't" is not a rebuttal), it's up to you to prove that an actual backdoor exists. Not a "potential" one standing in the empty space where a real one would be. Which is the same as "no" backdoor.
Quote:
Originally Posted by STDOUBT
Why is it that Slackware seems to attract so many complainers?
Because trolls register new accounts after getting banned.
Yes, and clever malware in the first compiler could propagate itself into the built compiler. The type of attack where a malicious compiler inserts an exploit into an executable that it builds was demonstrated to be viable decades ago.
I don't want this point being dropped yet.
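The attack being referenced is Ken Thompson's "Reflections on Trusting Trust". A toy sketch of the idea (the "compiler" below is entirely made up for illustration; it just copies source text to "binary" text) shows how a trojaned compiler binary can survive recompilation from perfectly clean source:

```python
# Toy model of the Thompson "trusting trust" attack. A real attack works
# at the machine-code level; strings stand in for binaries here.
BACKDOOR = "\n# backdoor: secret credential accepted"
TROJAN = "\n# trojan: self-propagation logic"

def clean_compile(source: str) -> str:
    """An honest compiler: the 'binary' faithfully reflects the source."""
    return source

def trojaned_compile(source: str) -> str:
    """A compiler binary that was built once from malicious source."""
    out = source
    if "check_password" in source:   # target 1: the login program
        out += BACKDOOR
    if "clean_compile" in source:    # target 2: the compiler itself
        out += TROJAN + BACKDOOR     # re-insert the whole trojan
    return out

login_src = "def check_password(pw): ..."
compiler_src = "def clean_compile(source): return source"

# The login binary is backdoored even though its source is clean:
assert BACKDOOR in trojaned_compile(login_src)
# Rebuilding the compiler from pristine source still yields a trojaned
# compiler, so auditing the source alone cannot catch it:
assert TROJAN in trojaned_compile(compiler_src)
```

The point of the sketch: once the malicious binary exists, the malicious logic never has to appear in any source tree again.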
Resolver, do you have a problem with other compilers that are written in the languages they compile? Or are you singling out Rust because it's the only one you're aware of?
Quote:
Yes, and clever malware in the first compiler could propagate itself into the built compiler. The type of attack where a malicious compiler inserts an exploit into an executable that it builds was demonstrated to be viable decades ago.
I was surprised when I looked over the thread and didn't see a link for this. I'll post it for you:
If this is your actual concern, then you should have stated that in your top post instead of gesturing about "blindly downloading untrusted binaries", which obviously isn't the same thing. It also makes absolutely no sense for you to single out Rust. I mean, unless you actually think Rust is the only language whose compiler is written in itself...
The fact that this has literally nothing to do with the prebuilt compiler binaries that you complained and complained about, is explicitly pointed out here:
Doesn't Ken Thompson's virus depend on the fact that the compiler (a pre-existing binary version thereof) was used to bootstrap itself?
Quote:
No.
If you want a distro that mitigates this with reproducible builds, which you only started mentioning now, then I have no idea why you're using this one.
Quote:
If Patrick wants to do it right, he should:
1. compile Rust going all the way back to the original OCaml bootstrap (as someone suggested);
2. aim for reproducible builds, like other distros have.
If you're so inclined, maybe contribute patches toward a fully reproducible Rust build?
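For what it's worth, the property reproducible builds promise is easy to state: two independent builds of the same source must be bit-for-bit identical. A minimal sketch of just the verification step (file names are hypothetical; the hard part in practice is normalizing timestamps, paths, locales, etc. so the hashes actually match):

```python
import hashlib

def sha256_file(path: str) -> str:
    """Hash a build artifact in chunks so large packages fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def is_reproducible(artifact_a: str, artifact_b: str) -> bool:
    """True if two independently built artifacts are bit-for-bit identical."""
    return sha256_file(artifact_a) == sha256_file(artifact_b)

# Hypothetical usage, comparing two clean-room builds of the same source:
# is_reproducible("build-a/rust.txz", "build-b/rust.txz")
```

Tools like diffoscope exist to explain *why* two artifacts differ when this check fails.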
To me, this is all security rhetoric. Security is about minimising risk, not eliminating it.
At the end of it all, you have to trust *someone* or *something* and roll with it.
Why does anybody trust *anything* - source, binaries from any vendor? because they digitally sign it?
Why does someone in the Debian team choose to trust a new developer just because they meet them in person in the pub? It's a choice.
Why does a company choose to hire someone to represent them, when that person could lose them business just by acting poorly with customers, or writing bad code that brings their platform down or causes an outage for a customer, losing them millions of $ an hour? They assess the risk. It's all the same thing to me, just different variables involved in the risk assessment process.
Does anybody reverse engineer the binaries to determine whether the source shipped is really the one used to build the binary? I can't even imagine that it'd be possible since compilers create different code based on the optimization level, the architectural nuances and the compiler version itself.
Why does anybody download and use the packages I push out for Slackware ARM? I'm just some guy sitting in my garden office listening to Bach whilst reading about routing preferences in Azure cloud, but people do use it and I assume, either trust that what I put out is safe to use, or they don't think about it.
Why do I trust the build scripts on Slackbuilds.org? because I look through them briefly but I don't extensively study them: I trust the guys that run it to take care of that stuff; but I harbour the risk and take the responsibility myself at the expense of figuring out how to build something myself.
Same with alienBOB's repo - I trust him so I use some of his packages without question.
Quote:
You do realize that Slackware is distributed as prebuilt binaries, right?
Your reply missed the point.
Every package in Slackware is an "untrusted" binary by the standards you seem to be using. I invited you to clarify your definition of "untrusted" and you simply ignored that.
True, but in this case, it's applied to a script that you can read and analyze.
Have you SEEN that script? The very first line makes DIRECT reference to UNTRUSTED bash, and if you look closer it invokes a host of UNTRUSTED binaries like rm, cp, tar, cat, chown, sed, find.. the list goes on, what a DISASTER! The fact is it's time to shut the whole operation down.
Quote:
To me, this is all security rhetoric. Security is about minimising risk, not eliminating it.
At the end of it all, you have to trust *someone* or *something* and roll with it.
Why does anybody trust *anything* - source, binaries from any vendor? because they digitally sign it?
Why does someone in the Debian team choose to trust a new developer just because they meet them in person in the pub? It's a choice.
Why does a company choose to hire someone to represent them, when that person could lose them business just by acting poorly with customers, or writing bad code that brings their platform down or causes an outage for a customer, losing them millions of $ an hour? They assess the risk. It's all the same thing to me, just different variables involved in the risk assessment process.
Well, for you, as a technically trained professional, understanding of the concept of security, its purpose, and its fundamental role in building trust can be limited. There's nothing to blame here; it's understandable, and I'm replying to your statements because I believe that you, in your capacity as one of the core distro devs, could project your limited understanding onto the community.
Minimizing risk is only one of the purposes of security, at the lower end of the options spectrum. I'd like to add: it is the most incompetent/superficial one and, even from a trade-off perspective, the lowest effort.
Security is what trust is based/built upon and you definitely don't have to trust *someone*, but *something*. That something is called science and the particular field is called security. While focusing on science, on organizational/business level you have Security Management as a standalone professional branch (not limited only to technology). Not all businesses (size/specifics) have a specialized Security Management organizational unit, but its functions are applied throughout the organization where required (pretty much everywhere).
Now, just for some background, I'd like to summarize what Management means (definition/purpose/"abstraction"), without going through its classifications and functions:
The identification of resources (money/time/competencies) and organizing/orchestrating these to achieve an objective (resolve a problem) in an efficient (optimal/best effort, no waste) and effective (working solution) way.
On your 3 why's:
- the security (signing - use of crypto) in the SW distribution systems offers trust (anti-tampering verification) and the fact that the source code is open and can be audited also offers trust (open for inspection - "manual" anti-tampering verification). An automated Source Code Management system would also improve the security (and its resulting trust) for the latter.
- meeting someone in person is part of the investigation (observation and assessment) and it's an opportunity to also capture non-verbal communication. It's part of the Personnel Management branch, implemented in all organizations (HR - Recruiting - Interviews), it requires Human Science competencies and it's also a clear security function.
- already covered in the previous section. I'd like to add that it's a ludicrous view to believe that there are no security mechanisms (decisional) available to implement in an organization to limit the potential damage of an individual. There are even formal rules (legislation) developed and continuously improved by government institutions. I have worked in different capacities on mission-critical systems, and had to both design and approve technical procedures, and send them to the customers for approval, in order to assess and mitigate the potential risk, not only "minimize" it. But then, gradually, the greenhorns (the brainwashed "flat organization (flat earth?)" and "agile"-irresponsible disturbed) took over, constantly eroding (hacking) these security procedures as "the established bureaucracy", and now you get overwhelming reports about major security breaches/flaws and about how insecure actual IT systems are. No need to buy comic books anymore; just read the papers. (Very) competent technical people handling subjects they had no professional competency in. Well, some got a "management training" and "certification" from one of these "demand generation" private pseudo-institutions that are making fortunes from fooling people that a brief training (studying a book) will make them competent. BTW, speaking from experience: when I was really young, I read a book and thought I knew everything.
Being a "minimalist" about the security role, focusing only on minimizing the risk, is incorrect. Optimal/best effort is required instead.
Of course, I'm not living in an ideal world; I just wanted to make clear that the science and the experience are there, available publicly. If they're not considered and implemented, due to bad will, lack of commitment, hidden agendas, incompetence, etc., it's not the science that's to blame. All of these represent hacks (social) on the organizational security system (incompetence being the worst). Yet another incentive to stay vigilant, care about security, and not fall into the "frustration pitfall" - "It's all the same thing to me".
If you ask yourself why I invested the effort to detail/clarify on your statement, it's because I care. Not necessarily about you, but about Slackware and its community. Nothing personal, just plain formal (scientific).
______________ end of reply_______
resolver has a valid point, not really well supported I must admit, and one that is, unfortunately, generally applicable in the SW world, not only affecting the Slackware distro.
There are now security enabled SW management systems that can (and are sometimes) interchained:
- Source Code Management systems that are already employed by the majority of the SW projects. Slackware lacks one, although there is a git setup by AlienBob (hope I'm not wrong), mirroring the changes, but not used for the actual development
- Automated Build Systems, AFAIK not handling integrity but only implementing access control. Slackware lacks this too and has a sort of security-by-obscurity implementation. These build systems, including compilers, are the ones that bother me too, and I don't have a technical solution to offer, because the integrity-check infrastructure should be implemented upstream by the toolchain devs first. How? I don't have a clear solution either, but some verifiable security orchestration of gradual signing of the intermediary build stages (some blockchain) and the resulting binaries, together with kernel security (memory protection checks come to mind) and dependent lib checks (which in turn should have been signed during their own building), could make up a better integrity-check solution at this level. Food for thought for competent technical people.
- Distribution Systems (digital signatures) - these are widely available and have a "sufficient" security model.
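The "gradual signing of the intermediary build stages" idea above can be sketched minimally as a hash chain over bootstrap stages (the artifact contents below are placeholders; a real scheme would additionally sign each digest with the builder's key):

```python
import hashlib

def chain(prev_digest: bytes, stage_artifact: bytes) -> bytes:
    """Bind each bootstrap stage to everything that came before it."""
    return hashlib.sha256(prev_digest + stage_artifact).digest()

# Hypothetical three-stage compiler bootstrap:
d0 = chain(b"", b"stage0: rustc built from the OCaml-era bootstrap")
d1 = chain(d0, b"stage1: rustc built by stage0")
d2 = chain(d1, b"stage2: rustc built by stage1")

# Tampering with an earlier stage changes every later digest, so the
# final digest commits to the whole bootstrap history:
t0 = chain(b"", b"stage0: tampered")
t2 = chain(chain(t0, b"stage1: rustc built by stage0"),
           b"stage2: rustc built by stage1")
assert t2 != d2
```

This is only the integrity half; distributing and verifying the per-stage signatures is the part that would need upstream toolchain support.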
For the ones that seek to learn more about trust, Bruce Schneier has some good explanations: https://www.schneier.com/news/archiv...ier_on_tr.html
- he's a security professional, unlike me (I hold a business degree). Better to trust him.
And related to the trade-offs we are doing - including me: I don't blindly trust the Slackware distro, but I do trust the security of the digital signatures in the distribution system, I can inspect the open source build scripts, rebuild my own security related packages (which I do, and maintain them on my own) and this is my best effort as an individual user. Bootstrapping and using LFS (or Gentoo) would be my alternative, but I don't have the resources (mainly time) to do it and therefore I'm OK with Slackware. That's only security/trust focused. Obviously there are other things I like and appreciate about Slackware. However, I'm constantly vigilant and investigative, questioning things and not being a minimalist in my approaches
Again, Schneier on the security mirage/trade-offs: https://www.youtube.com/watch?v=NB6rMkiNKtM
(I don't have any connection to Schneier; I simply find him a good and neutral professional (well, he has his own agenda - selling his books, but those are also good), well-spoken/articulate, and possessed of sound logic.)