[Slackware security] Potential backdoor risk in Rust
Slackware: This Forum is for the discussion of Slackware Linux.
It's overly generous to attribute this to a lack of command of the English language (another poster was giving him the benefit of the doubt). Both the essential overt message and the subtext come through loud and clear.
Agreed. I find abga's English perfectly good, and quite frankly, if they hadn't said so, I would have assumed they were a native speaker.
Again, Slackware is not a from-source distribution. Packages get rebuilt using the binaries presently available. That means there was an original rust compiler, written in OCaml, that was compiled into a package.
If no talented hacker or state actor was able to inject any of those binaries with persistent malware during that chain of compilation, fine; but if they did, all future rust compilers and/or libraries could carry that malware.
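That bootstrap chain is exactly why cautious builders verify the seed binaries before using them. Here is a minimal sketch of the idea: check a downloaded stage0 compiler against a hash published out-of-band before trusting it. Everything is simulated with a locally created scratch file, and the tarball name is illustrative, not a real release artifact.

```shell
#!/bin/sh
# Sketch: verify a bootstrap (stage0) compiler tarball against a published
# SHA-256 hash before using it to build anything. The "published" hash file
# is generated locally here just so the check can be demonstrated.
set -e
dir=$(mktemp -d)
cd "$dir"

# Stand-in for the downloaded bootstrap tarball (illustrative name):
printf 'stage0 compiler bytes' > rustc-stage0.tar.xz

# Stand-in for the hash list published out-of-band by the project:
sha256sum rustc-stage0.tar.xz > rustc-stage0.tar.xz.sha256

# The check a cautious builder would run against the real artifact:
sha256sum -c rustc-stage0.tar.xz.sha256 && echo "bootstrap hash OK"
```

In a real build you would also verify a detached GPG signature on the hash file itself, since a hash fetched from the same mirror as the tarball proves nothing. And a matching hash only proves you got the file the publisher intended, not that the publisher's binary is clean; that residual gap is the point being argued here.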
It's like you're cooking with a pot you never clean and you're proud of it, and if anyone questions you, you sneer and walk away.
OK, here's what you said: To say that there's a "potential" backdoor is just a scary FUD way to say that there is "no" backdoor; it means exactly the same thing.
But if you think about it, "potential" is a probability > 0, whereas "no" is a probability of 0; these are by definition not the same thing.
You're making more of an evangelical argument of certainty with no proof to back it up except your blind faith, or perhaps fear and denial.
Quote:
If no talented hacker or state actor was able to inject any of those binaries with persistent malware during that chain of compilation, fine; but if they did, all future rust compilers and/or libraries could carry that malware.
I don't think anybody is arguing with this point. It's been pointed out already that this is a known hole in the whole chain of trust.
Quote:
It's like you're cooking with a pot you never clean and you're proud of it, and if anyone questions you, you sneer and walk away.
No, this assertion also doesn't hold water (no pun intended!). With a dirty pot, you can clean it.
With what we're discussing here, that's not the case: we are *aware* of what you are saying (I and many others in this thread acknowledge it and agree with it!), and we are pointing out the underlying principles and practicalities of obtaining software over the Internet, from which one cannot escape.
I cannot see why this is not being comprehended.
To me, this is all security rhetoric. Security is about minimising risk, not eliminating it.
You can minimize some risks and eliminate others.
1. Eliminating risk
Example: Removing Flash Player eliminated scores of potential malware attacks.
2. Minimizing risk
To minimize risk, you have to avoid reckless practices and unnecessary risk-taking. The decision of the Rust community to download binaries without asking permission, even if it doesn't apply to the way Slackware is actually built, shows a recklessness like that of a drunk driver. The fact that Mozilla thinks using Rust for Firefox is a good idea puts their commitment to protecting their users in doubt.
Quote:
The fact that Mozilla thinks using Rust for Firefox is a good idea puts their commitment to protecting their users in doubt.
Given everything discussed here, the only thing extra I have to say on this is I wish they'd stop using rust because of a number of reasons.
1. The unstable ABI.
2. The complexity that exists within Firefox (and other projects) to try to map the Rust target triple/quadruplet names onto the autotools versions. It's total madness to me, and it only seems to affect non-x86/64 architectures.
Here is the thing: almost any attacker is going to take the path of least resistance.
Some, though, will take the opposite approach: if they can expend a relatively large effort just once ("one and done") in order to gain a persistent compromise, they will.
Recall that some attacks have gone unnoticed for months or years. Attackers, especially governments, want to be on the most computers for the longest period of time. Linux can't be left out of that, so they will seek a way in.
In the case of a computer language, the upfront effort to gain the compromise would be technical, and after that it would be social: encouraging "useful idiots" (a term of art) to become fanboys and promote the language despite public resistance.
Slackware is not LFS, yet it still also has all the sources visible.
So, you can theoretically build it from scratch, too, with some elbow grease on your end.
It should be possible for everyone to rebuild the distro, generate hashes for every .txz file, and find that everyone's hashes are identical.
That's the concept of reproducible builds. It's a simple idea: source tarball X always gives you binary Y.
Open source doesn't mean much if the binaries offered online don't match the ones you build yourself.
Reproducible builds have the potential to protect against fraud, as when a project says "these are the sources we use" but is actually patching in malware, either because it wants to or because some organization is pressuring it to.
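As a concrete illustration of why "source X always gives binary Y" takes real work: even two honest builds of identical contents can hash differently, because archive metadata such as timestamps leaks into the bytes. This sketch uses GNU tar on locally created scratch files; nothing here is a real Slackware package.

```shell
#!/bin/sh
# Sketch: identical file contents, different archive bytes. Reproducible
# builds normalize metadata (timestamps, ownership, file order) so that
# independent rebuilds become bit-identical and their hashes match.
set -e
dir=$(mktemp -d)
cd "$dir"
mkdir pkg
echo 'same program bytes' > pkg/foo

# Two honest builds of the same contents, archived at different times:
tar --mtime='2020-01-01 00:00:00' -cf build-A.tar pkg
tar --mtime='2021-01-01 00:00:00' -cf build-B.tar pkg
sha256sum build-A.tar build-B.tar   # different digests, same contents

# Pinning the timestamp makes a rebuild bit-identical to the original:
tar --mtime='2020-01-01 00:00:00' -cf build-C.tar pkg
cmp build-A.tar build-C.tar && echo "normalized rebuild matches"
```

This is the kind of normalization reproducible-builds projects standardize (for example, pinning all timestamps to a fixed value), so that a hash mismatch means tampering or a real difference, not build-environment noise.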
Quote:
Originally Posted by resolver
You're making more of an evangelical argument of certainty with no proof to back it up except your blind faith, or perhaps fear and denial.
So says the person who issues ad hominem attacks. Pot, kettle, black.
* cough
...
I'm going to put you on my ignore list now. If you come back with taunts about cowardice or surrender (as I suspect your first impulse will be), then you will simply be proving that you've always intended to be the troll you've been acting like.
Quote:
It should be possible for everyone to rebuild the distro, generate hashes for every .txz file, and find that everyone's hashes are identical.
That's the concept of reproducible builds. It's a simple idea: source tarball X always gives you binary Y.
Open source doesn't mean much if the binaries offered online don't match the ones you build yourself.
Reproducible builds have the potential to protect against fraud, as when a project says "these are the sources we use" but is actually patching in malware, either because it wants to or because some organization is pressuring it to.
And what system are you going to be building on? A Slackware system? If so, that highlights the fundamental problem you're referring to.
How do you know the system you're building the packages on (even if they give the same hashes) isn't already compromised?
This goes for any distribution, which is why I pointed out the issues with Symantec, Intel...
How do you know the food you eat isn't poisoned? Do you trust your supermarket/shops/whatever?
Surely the key to Slackware security is the slackbuild. With other binary distros, you have to take the downloaded package on trust. Essentially what you download is a blob. Yes, you have access to the source (GPL requires that) but, as resolver says, a single source package can generate multiple binaries depending on how the build was configured.
With source-based distros like Gentoo and Crux, you only download source code, which is plain text. No blobs! But you pay for that with long, slow builds. At least that's what happens when you use hardware like mine.
Now with Slackware, you have access to both the source and the slackbuild. That means you can build it yourself, knowing that the result will have the same functionality as the binary built by Pat and his team. In fact, once slack is installed, there's nothing to stop you from rebuilding each and every package if you're paranoid. You have the convenience of binary with the security of source.
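A sketch of that rebuild-and-compare workflow. Because archive bytes can legitimately differ between builds (timestamps and so on), the meaningful comparison is of the extracted contents; both packages here are simulated scratch archives, and the paths are illustrative.

```shell
#!/bin/sh
# Sketch: compare "my rebuild" against the "official" package by contents.
# Both archives are simulated locally; a real run would extract the
# mirror's .txz and your SlackBuild's output instead.
set -e
dir=$(mktemp -d)
cd "$dir"

mkdir -p root/usr/bin
printf '#!/bin/sh\necho hello\n' > root/usr/bin/foo

# Same payload, archived at different times (so the raw bytes differ):
tar --mtime='2024-01-01 00:00:00' -C root -cf official.tar usr
tar --mtime='2024-06-01 00:00:00' -C root -cf rebuild.tar usr

mkdir official rebuild
tar -C official -xf official.tar
tar -C rebuild  -xf rebuild.tar

# Timestamps aside, the installed files should be identical:
diff -r official rebuild && echo "contents identical"
```

In practice you would also diff the package metadata inside the .txz (such as the install/doinst.sh script), since a payload can hide there as easily as in a binary.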
As to FF using Rust, I don't like it but I can see why they did it. The switch to Rust hugely sped up the running program. I remember being amazed at the difference the first time I used a Rust-based version.
And btw, rustc isn't the only compiler that can only be built by itself. The same is true of gcc. Apparently other C compilers can't build it. But I haven't noticed anyone complaining about that.
Quote:
And btw, rustc isn't the only compiler that can only be built by itself. The same is true of gcc. Apparently other C compilers can't build it. But I haven't noticed anyone complaining about that.
I've pointed that out multiple times. Resolver has made a point of ignoring it. (Obviously because acknowledging it would damage his position).
Personally, I'm (provisionally) happy to trust the various moving parts in the Slackware project. It strikes me that this is at the end of the day a personal choice.
Quote:
And what system are you going to be building on? A Slackware system? If so, that highlights the fundamental problem you're referring to.
How do you know the system you're building the packages on (even if they give the same hashes) isn't already compromised?
Is it possible that working at a security company makes someone give up all hope and become cynical? As if you're trying to achieve good hygiene in a truck-stop bathroom?
Generating a distro is a higher duty than keeping your personal computer clean. It is more like keeping a biosafety lab clean. It has to be done right. Minimize or eliminate all risks. Wipe the build computer's hard disk before starting. Install only the bare minimum, most carefully examined software to perform the build. Disconnect the Internet.
If a regimen of good computing hygiene is instituted, instead of being shrugged off as too hard, not worth my time, not my problem, or impractical, then there is a real chance of improving security. It's when people take a bad attitude about following proper procedures that you hear in the news about 30 people getting food poisoning at a restaurant, bridges collapsing, or any number of other disasters that result from shirking responsibility.