Another point for dynamic linking: a function is loaded into memory only when the program actually needs it, whereas a statically compiled program carries the library code in its own image. (OK, everyone has 4+ GB of RAM in their PC these days... but that isn't a good reason at all.)
Add: if two programs need the same library function, the library is loaded once and its memory is shared between them.
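A quick way to see this in practice (a sketch; /bin/ls and /bin/cp are just example binaries):
Code:
# Both binaries resolve to the same shared libc file on disk
ldd /bin/ls | grep 'libc\.so'
ldd /bin/cp | grep 'libc\.so'

# A running process maps that same file rather than carrying its own copy
grep 'libc' /proc/self/maps | head -n 3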
If you're trying to build your own debs or RPMs, yes. If you're just doing ./configure ; make ; make install, or building for a distribution that doesn't enforce dependency versions (Slackware, Arch, Gentoo, etc.), then no. The build system detects the dependencies that are present on your system.
In my mind, that last part should read "Complains and throws up vague errors about missing dependencies on your system."
Then it's "Oh, my bad, I compiled that library and forgot to run ldconfig. Sorry, ./configure!"
Yeah yeah yeah. I never said I was good at compiling...
My point being, if the source archive just 'came with' everything it needed, libraries included, then it wouldn't be such a ... troubleshooting process ... to compile and install software.
Quote:
My point being, if the source archive just 'came with' everything it needed, libraries included, then it wouldn't be such a ... troubleshooting process ... to compile and install software.
You'll spend your time downloading huuuuuuge source packages.
It is in the design phase, so it's mostly theory ATM. I'll believe it when I see it.
Yeah, that's why I said interesting information. I don't think it's made much progress since the last time I looked a couple of years ago. And I don't expect much in the future: lots of people are interested in complaining about complicated software; a lot fewer are interested in writing simpler software.
Quote:
You'll spend your time downloading huuuuuuge source packages.
Having used Docker and created quite a few images, I think you'd be surprised how small the needed packages would be for most software. I'd say most range between 1 MB at the smallest and 300 MB at the biggest.
Luckily we are in the broadband generation, and getting a 300 MB file is trivial. I use 300 MB watching 5 minutes of Netflix, or browsing the internet for 10 minutes.
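You can get a rough feel for those numbers yourself (a sketch; /usr/bin/ffmpeg is just an example binary) by totalling the shared libraries a program actually links against:
Code:
# Total on-disk size of the shared libraries a binary depends on
ldd /usr/bin/ffmpeg | awk '/=>/ {print $3}' | grep '^/' | xargs du -chL | tail -n 1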
Quote:
lots of people are interested in complaining about complicated software; a lot fewer are interested in writing simpler software
One advantage of shared libraries is that they let you get a surprising amount of software onto an installation disk. And while some of us are "in the broadband generation", that isn't true of most of the world. Even apart from the cost, I wouldn't like to try downloading a distro over a satellite connection!
From a distribution's point of view, shared libraries save disk space and RAM. Maintenance is also much simpler: only one copy of the library has to be updated when the need arises.
But from the point of view of a user who needs to install a different version of some software than the one the distribution provides, or of a software vendor who wants his binary to work across different versions of different distributions, the first concern is just to make it work. Not disk space savings, not security. Just make the software work.
And it's not that one POV is correct and another is wrong - it's just different groups of people with different priorities.
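To make the maintenance point concrete (a rough sketch; libssl is just an example), you can see how many installed programs would be affected by updating a single shared library:
Code:
# List installed programs that link against a given shared library
for f in /usr/bin/*; do
    ldd "$f" 2>/dev/null | grep -q 'libssl' && echo "$f"
done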
And by the way, for those in the second group who want to create portable executables, the following tools may be of use:
- Statifier (http://statifier.sf.net)
- Ermine (http://magicermine.com)
Can't one build all one's apps statically? Can't one invent one's own distribution, 'Static Linux'? I'm a Slacker (and have no dependency problems and never build apps statically), but it would seem that this could be a new branch of 'Linux from Scratch'.
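For a single program it's certainly doable (a minimal sketch; hello.c is a placeholder, and the static version of libc has to be installed, e.g. a glibc-static package on some distros):
Code:
# Build the same program dynamically and statically
gcc -o hello-dynamic hello.c
gcc -static -o hello-static hello.c

# The static binary has no runtime library dependencies...
ldd hello-static        # prints "not a dynamic executable"

# ...but it is considerably larger on disk
ls -lh hello-dynamic hello-static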
You're right, I don't have the stats to support that. Anyway, it's off-topic and I'm sorry I mentioned it at all.
Quote:
Originally Posted by nbritton
Linux developers apparently never got the memo that users want simple application installs.
Linux developers' response has been to use package managers instead.
Quote:
Originally Posted by szboardstretcher
Pull in what you need - make it work - distribute it. Stop with the 'I need this one 2 line function from version 12.83.1323.99a-b of foo.alpha-x86.el2 and my program will NOT compile without it!' nonsense.
In the title you talked about shared libraries, but then your posts have all been about distributing dependencies with source tarballs. I think that's unrelated to the issue of shared vs static libraries.
Quote:
Originally Posted by szboardstretcher
Luckily we are in the broadband generation, and getting a 300 MB file is trivial. I use 300 MB watching 5 minutes of Netflix, or browsing the internet for 10 minutes.
Bandwidth isn't free on the server side, though. Even so, it's probably more about convention than any technical reason. I think packaging deps with tarballs would also have the added benefit of saying "this software has been tested with library version x.y.z". Maybe you can get the GNU people to add this step to their standards: Making Releases.
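Something in that spirit can already be checked at build time (a sketch; libfoo and the version number are hypothetical) using pkg-config:
Code:
# Fail early if the installed library is older than the version the release was tested with
pkg-config --atleast-version=1.2.3 libfoo || {
    echo "libfoo >= 1.2.3 required" >&2
    exit 1
}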