Just for fun: solve the independent software vendors' installation woes
Just a fun topic I've wanted to play with. Independent software vendors want the ability to publish one installer that works on all distributions. The purpose of this thread is to post and discuss your ideas on how to do this.
Remember that there are some very smart people working on this problem and they haven't found the answer yet, so it's quite likely that everything posted here will have some serious flaws. Try to take criticism in good humour.
I'll go first.
I envisage some kind of InstallShield-style application that can interface with all the different package managers. It would do so by calling a binary, let's say /usr/bin/unipackage, that provides a standard input format (e.g. unipackage install foobar); the binary then translates that into the distro-specific call, like apt-get install foobar. The installer searches the repositories looking for dependencies; if there are unmeetable dependencies (either because the repos don't have them or because the installer lacks root privileges to use the repos), then the installer gives the option of using a private shared library.
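To make the idea concrete, here is a minimal sketch of what that translation layer might look like. The manager detection order, the command tables, and the "check" verbs are all my assumptions; a real unipackage would need far richer per-manager knowledge (flags, repo refresh, exit codes).

```python
import shutil

# Hypothetical translation table: one abstract verb per package manager.
# Real managers differ in flags (e.g. non-interactive installs need "-y");
# this is only a sketch of the dispatch idea.
COMMANDS = {
    "apt-get": {"install": ["apt-get", "install"], "check": ["apt-cache", "show"]},
    "yum":     {"install": ["yum", "install"],     "check": ["yum", "info"]},
    "zypper":  {"install": ["zypper", "install"],  "check": ["zypper", "info"]},
}

def detect_manager():
    """Return the first known package manager found on PATH, or None."""
    for name in COMMANDS:
        if shutil.which(name):
            return name
    return None

def translate(manager, verb, package):
    """Turn a distro-neutral request into a distro-specific command line."""
    return COMMANDS[manager][verb] + [package]

print(translate("apt-get", "install", "foobar"))
# A real unipackage would run the translated command via subprocess, and
# fall back to a private shared library when the repos can't satisfy it.
```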
One of the major criticisms of offering private shared libraries is the large downloads it causes. To address that, ISVs would offer three downloads (the program that creates installable files would make all three at once): a tiny one that scans the repos to see if everything is available, a medium one shipping no libraries, and a big one shipping libraries. If they are shipping CDs/DVDs, then they can just discard the first two and put the big one on the CD/DVD.
The final problem is how to specify dependencies across distros. I have two ideas; I think the first one is better, but I'm open-minded:
1) Create a cross-distro standard of serial numbers. All package managers would need to support searching by these serial numbers in addition to the usual names. Only important libraries would get a serial number; if the ISV depends on something without a serial number, it is always provided as a private shared library. Note that multiple packages can have the same serial number; when this happens it means that all those packages must be installed.
2) ISVs check the dependencies for a list of key distributions (including the BSDs). When the installer is run, the appropriate list is used; for non-key distributions, the closest match is used and equivalent packages are substituted.
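Idea (1) could be sketched as a lookup table. The serial numbers, distro keys, and package names below are invented for illustration; the point is only the shape of the data: one serial maps to one or more local packages, and an unknown serial means "ship a private copy".

```python
# Hypothetical registry mapping cross-distro serial numbers to the local
# package names that satisfy them. Numbers and names are made up.
SERIAL_REGISTRY = {
    "LSB-0001": {"debian": ["libpng12-0"], "fedora": ["libpng"]},
    # A serial can map to several packages; all of them must be installed.
    "LSB-0002": {"debian": ["libssl0.9.8", "ca-certificates"],
                 "fedora": ["openssl"]},
}

def resolve(serial, distro):
    """Return the local package names satisfying a serial number, or None
    if the serial is unknown -- in which case the ISV falls back to a
    private shared library."""
    entry = SERIAL_REGISTRY.get(serial)
    if entry is None:
        return None
    return entry.get(distro)

print(resolve("LSB-0002", "debian"))   # both listed packages are required
print(resolve("LSB-9999", "debian"))   # unknown serial -> private library
```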
A few final notes:
The installer would use Portland to add menu entries.
A root user would have the option to install menu entries for all users; other users can only add them for themselves.
The installer would let the user choose a folder and put all files in that folder (or subdirectories of it), or it would let the user follow the filesystem hierarchy, assuming he has write permissions, of course.
The installer would not be able to edit existing files or otherwise change settings, and it would be extremely bad form to ask a user to do so without a good reason.
The installer would always ask before overwriting any file.
I would think the issue would be that they don't want to make a bunch of apt-get, rpm, tgz, etc. files. They want one file which can be downloaded and will work on any distribution. So it's more than a front-end issue: one binary package working on all the systems.
And it needs to be smart enough to know how to register itself, resolve dependencies, and correctly locate where it needs to place important scripts (for startup or whatever) if they are needed.
And once you've met all those issues... you're about 90% of the way there. And the other 10% is the tricky stuff.
Quote:
Originally Posted by frob23
I would think the issue would be that they don't want to make a bunch of apt-get, rpm, tgz, etc. files. They want one file which can be downloaded and will work on any distribution.
Yes, that is the issue; the whole idea of this thread is to try and find an answer.
Quote:
Originally Posted by frob23
And it needs to be smart enough to know how to register itself, resolve dependencies, and correctly locate where it needs to place important scripts (for startup or whatever) if they are needed.
I don't think it should have the right to do that; otherwise you get ridiculous things like on Windows, where chat software and media players start up automatically. If you want it to start up automatically, just symlink to the installed application in your ~/.kde/autostart or the GNOME equivalent.
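The symlink approach is simple enough to sketch. The application path below is hypothetical, and the demonstration points at a throwaway directory instead of a real home dir; the KDE autostart location is as described above.

```python
import os
import tempfile

def add_autostart(app_path, autostart_dir):
    """Symlink an installed application into a per-user autostart directory
    (e.g. ~/.kde/autostart), instead of letting an installer register
    itself system-wide."""
    os.makedirs(autostart_dir, exist_ok=True)
    link = os.path.join(autostart_dir, os.path.basename(app_path))
    if not os.path.islink(link):
        os.symlink(app_path, link)
    return link

# Demonstrate against a temp dir standing in for the user's home.
home = tempfile.mkdtemp()
link = add_autostart("/opt/foobar/bin/foobar",
                     os.path.join(home, ".kde", "autostart"))
print(os.readlink(link))
```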
Right, that's the point. But your suggestion of having a program which calls apt-get or rpm or whatever means they have one installer but have to duplicate their efforts to make a release for each package manager anyway. It needs to work outside the existing package managers and, at best, know how to register itself with them after the fact.
This is an age-old problem with Unix and Unix-like systems and is one reason /opt was so popular for 3rd party and proprietary applications. And as ugly as /opt can be, it makes some sense. And it also conforms to your comment about using symlinks to bring the program in line with the existing system.
I mean, that may be the solution which wins at the end of the day. It certainly solves a lot of the issues about placement and conflicts. As for libraries, it can test for the existence of them and, if they don't exist, download a library into its directory. So if Firefox needed libpng it would be put in /opt/mozilla/firefox2/lib/ and the system would have to regenerate libraries that depend on it.
This system would also need to know if a library it needs is part of another program installed this way (like libpng in the previous example). So if xv was installed in /opt/bradley/xv and needs libpng, it would need to make sure that it had its own copy of libpng in /opt/bradley/xv/lib/. This is because these programs must remain independent of each other: removing Firefox should not break xv. A hard link is probably best, but a direct copy could be done as well.
Programs installed in this manner need to be independent of each other and also cleanly removable. Which means interdependent libraries need to be copied or cleaned up neatly.
Note: installing into a shared /opt/png/lib/ in this case would be wrong, because the removal of /opt/mozilla/firefox would leave it behind.
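The hard-link-or-copy step described above might look like this sketch: hard link when both ends are on the same filesystem (cheap, and deletions on either side can't break the other), fall back to a plain copy across filesystems. The paths here are throwaway stand-ins for /opt/bradley/xv/lib/.

```python
import errno
import os
import shutil
import tempfile

def private_copy(library, app_lib_dir):
    """Give an /opt application its own copy of a library: hard link when
    possible, plain copy when the link crosses filesystems. Either way,
    removing one application can never break another."""
    os.makedirs(app_lib_dir, exist_ok=True)
    dest = os.path.join(app_lib_dir, os.path.basename(library))
    try:
        os.link(library, dest)          # same-filesystem case
    except OSError as exc:
        if exc.errno != errno.EXDEV:    # EXDEV: link across filesystems
            raise
        shutil.copy2(library, dest)     # fall back to a real copy
    return dest

# Demonstration with temp paths standing in for /opt/bradley/xv/lib/.
tmp = tempfile.mkdtemp()
lib = os.path.join(tmp, "libpng.so.3")
open(lib, "w").write("fake library\n")
dest = private_copy(lib, os.path.join(tmp, "xv", "lib"))
print(os.path.exists(dest))
```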
And so on. This is why /opt is considered sub-optimal. But it does work well enough that it's been around for a while for that reason.
Quote:
Originally Posted by frob23
Right, that's the point. But your suggestion of having a program which calls apt-get or rpm or whatever means they have one installer but have to duplicate their efforts to make a release for each package manager anyway. It needs to work outside the existing package managers and, at best, know how to register itself with them after the fact.
I proposed an abstraction layer to deal with that: on any distro the installer would _always_ use unipackage-check and unipackage-install, and unipackage would use apt-get or yum to do the actual checking or installing, allowing the same binary to run on all systems.
Essentially, what I was proposing is the use of /opt (or something in a user's home dir) much like the original way, but with support for the installer to call the package manager to check dependencies and only create /opt/firefox/lib/libpng if it wasn't available in the repositories, rather than just not available on the current system.
Your point about registering packages with the package manager after installation is an important one; I can't believe I missed it.
It opens up quite a few new issues, like the fact that root could delete a package a user's program depends on.
It could probably be solved by creating a dummy package and "installing" that so dependencies are tracked. [edit: I realised a problem in my earlier dummy package idea so I cut out the offending part]
Quote:
Originally Posted by frob23
This system would also need to know if a library it needs is part of another program installed this way (like libpng in the previous example). So if xv was installed in /opt/bradley/xv and needs libpng, it would need to make sure that it had its own copy of libpng in /opt/bradley/xv/lib/. This is because these programs must remain independent of each other: removing Firefox should not break xv. A hard link is probably best, but a direct copy could be done as well.
I don't think so. If you don't do this, then a third-party program will either get the library from the system or use a special modified library that's stripped of unnecessary code (I think that can be done easily). If the /opt programs depend on each other's libraries, then they will need to carry full-sized libraries every time. Plus it's more complex and introduces more room for problems.
Well, ideally things in /opt are supposed to be statically linked to avoid interdepending library issues. But this is never the case in the real world. I'm not trying to be negative here... but fleshing out the limitations is a way to understand the requirements.
If we use apt-get (for example) on a system to pull down all the depends then we need to be able to mark these depends as required by our 3rd party package. This is for two reasons. We don't want to delete a dependency (thinking it's unused) and break the program. But we also want to know what dependencies to check for removal if we want to remove a 3rd party application cleanly. Strictly, by the standards for /opt, all the program's required things should be under its directory structure. This is overkill once you start using more than a few applications in this manner. So I don't propose doing that. But we do need a way to safely clean up all the things a given package might install.
There is also the issue of package managers which don't track dependencies. They do exist in Linux. Or systems which don't have package managers. This program needs to work the same way in those as well. And each of these conditions increases the overall size of every program installed with this because the installer needs to be static and not depend on anything existing in the base, so all the possibilities need to be programmed for.
Quote:
Originally Posted by frob23
If we use apt-get (for example) on a system to pull down all the depends then we need to be able to mark these depends as required by our 3rd party package. This is for two reasons. We don't want to delete a dependency (thinking it's unused) and break the program. But we also want to know what dependencies to check for removal if we want to remove a 3rd party application cleanly.
I proposed a solution in my previous post: all you need to do is create a dummy package with all the third-party app's dependencies, then "install" that package. The dummy package would have to be automatically "uninstalled" when the third-party app is deleted, possibly by putting one tell-tale file in the /opt/foobar/ dir; if the file exists, then the program is still installed (if accidentally deleted, just make another file, any kind).
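Debian's equivs tool already builds empty packages from a small control file, so the dummy-package idea is not far-fetched. Below is a sketch of generating such a control file; the package name suffix and the example dependency names are my inventions, and a real implementation would feed the result to something like equivs-build.

```python
def dummy_control(app, version, depends):
    """Render an equivs-style control file for an empty package whose only
    job is to hold a third-party application's dependencies in the package
    database, so the package manager won't remove them as unused."""
    return "\n".join([
        "Section: misc",
        "Priority: optional",
        "Package: %s-deps" % app,        # naming convention is assumed
        "Version: %s" % version,
        "Depends: %s" % ", ".join(depends),
        "Description: dependency placeholder for /opt/%s" % app,
        "",
    ])

control = dummy_control("foobar", "1.0", ["libpng12-0", "libssl0.9.8"])
print(control)
# The uninstaller would purge the foobar-deps package when the tell-tale
# file in /opt/foobar/ disappears.
```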
As for what happens when root deletes the dependency anyway: the dummy package should drop the dependency from its list but remain installed (this probably means a patch to the package managers), and when the user next tries to run the program they are told what libraries are missing. The libraries can be obtained from the original installer; this would need third-party patches to offer a file-by-file update to bring new libraries up to speed.
Quote:
Originally Posted by frob23
Strictly, by the standards for /opt, all the program's required things should be under its directory structure. This is overkill once you start using more than a few applications in this manner. So I don't propose doing that. But we do need a way to safely clean up all the things a given package might install.
I definitely think that, apart from the dummy package, everything should be in its /opt subdir, and the installer should enforce this. Symlinks pointing to the wider filesystem are acceptable. Thus deleting that one directory will cleanly uninstall.
Quote:
Originally Posted by frob23
There is also the issue of package managers which don't track dependencies. They do exist in Linux. Or systems which don't have package managers. This program needs to work the same way in those as well. And each of these conditions increases the overall size of every program installed with this because the installer needs to be static and not depend on anything existing in the base, so all the possibilities need to be programmed for.
Not a problem: if someone uses a Linux with no dependency resolution, then they can manually do it for third-party apps as well. All the installer needs is the ability to manually select which dependencies should be private shared libraries and which ones shouldn't.
Yeah, I'm aware of pseudo packages (the BSDs have that ability to track third party applications and I've used it personally before).
Symlinks are a touchy issue. Hard links are ideal because they are the most robust on a single partition. But they fail across different filesystems, and they also defeat the purpose of package management (if you remove the package and its files are still hanging around). They work within /opt because they're not registered and they are copied for every single package which needs them. And also, /opt is likely to remain on one filesystem. But linking to things outside of /opt is going to bring lots of problems with it.
Also, there are many things package managers have to do which you are ignoring here. I know you don't want them to be able to edit things outside /opt but they generally have to do that. If you add a shell, for example, it should automatically edit /etc/shells. And there are library issues (ldconfig) and other things which the package manager has to take care of.
Couldn't some kind of method of dependency "flagging" be implemented? Every program that needs this certain thing could leave its mark on it and, upon uninstall, remove that mark.
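The flagging idea amounts to reference counting. One way to sketch it, assuming a marker-file scheme of my own invention (a `.needed-by/` directory next to the shared library, one file per consumer):

```python
import os
import tempfile

def flag(lib_dir, consumer):
    """Record that 'consumer' depends on the library in lib_dir by dropping
    a marker file; the set of markers doubles as a reference count."""
    marks = os.path.join(lib_dir, ".needed-by")
    os.makedirs(marks, exist_ok=True)
    open(os.path.join(marks, consumer), "w").close()

def unflag(lib_dir, consumer):
    """Remove the consumer's mark; return True when the library is now
    unreferenced and therefore safe to delete."""
    marks = os.path.join(lib_dir, ".needed-by")
    path = os.path.join(marks, consumer)
    if os.path.exists(path):
        os.remove(path)
    return len(os.listdir(marks)) == 0

lib = tempfile.mkdtemp()          # stands in for a shared /opt library dir
flag(lib, "firefox")
flag(lib, "xv")
print(unflag(lib, "firefox"))     # xv still needs it: not removable yet
print(unflag(lib, "xv"))          # last user gone: safe to remove
```

The usual caveat with refcount schemes applies: anything that bypasses the installer (root deleting files by hand) leaves stale marks, so a real tool would need a consistency check as well.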