LinuxQuestions.org
Welcome to the most active Linux Forum on the web.
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!

Old 12-27-2005, 05:53 PM   #1
caminoix
Member
 
Registered: Mar 2005
Location: cracow, poland
Distribution: Linnex, Gentoo, KateOS, PLD
Posts: 53

Rep: Reputation: 15
why so much ado about package management?


hello

i can't understand where all this ado about package management comes from.
i can understand people moaning about rpms, with their dependency traps and dispersion, or about sources and the need to compile everything, and so on. but there are ports, there is apt for rpm, there is Portage, there is alien, and many other programs.
after all, installing is just copying the right files into the right places, sometimes preceded by compilation, isn't it?
why can't one distribution support many package management methods?

could you please explain it to me or tell me where i could find the answer(s)?
 
Old 12-27-2005, 07:25 PM   #2
Artanicus
Member
 
Registered: Jan 2005
Location: Finland
Distribution: Ubuntu, Debian, Gentoo, Slackware
Posts: 827

Rep: Reputation: 31
Technically, there's nothing preventing a distribution from supporting multiple packaging systems.

My Slackware system has rpm installed, but I cannot install anything with it.. Why? Because it doesn't see my installed software. Most packaging systems have been designed to be The One and True System(tm), so they rely on their own databases for checking what is installed and what is not. So, if I try to install any rpm, it will blatantly tell me that I don't have glibc installed, or gtk, or anything else it might need at the time. If you try to force it, it eventually refuses when it cannot find that it itself is installed.. (:

So, it could be done if there were a unified way of storing the information about installed software, so that when you compile some library, the package management tools will find it even though they didn't install it.
The other solution is to ditch dependencies. Sometimes that's a good idea: Slackware packages are after all just tar.gz packages that conform to the standard Slackware directory structure, so in theory they can be installed successfully on any Linux machine of the same architecture. The same goes for rpms, though their weird cpio format makes it a tad tougher, but rpm2tgz solves that. And alien is good at conversions too.. (:

The fundamental problem is that package management systems don't want to co-exist, they want to take over the world.. So, fix the authors first, then fix the software.. We'll get there some day.. (: But first a lot of standards need to be laid out:
- directory structure (we're almost there)
- installation of non-standard software (a battle between /opt, /usr and /usr/local)
- installed packages db (I love the Slackware /var/log/packages/ way, it's the easiest to interface with, but there are also other good candidates; rpm's is horrible though..)
- skills: there is a lot of software and packaging around that is just plain lousy. Hard to address, but a problem nevertheless..
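To illustrate that last db point: a minimal sketch of why a plain-files database like Slackware's /var/log/packages/ is so easy to interface with. The directory and the package names (glibc, gtk+2, versions) are faked in a temp dir purely for illustration; nothing touches the real system.

```shell
#!/bin/sh
# Fake a tiny /var/log/packages-style database in a temp dir.
db=$(mktemp -d)
printf 'PACKAGE NAME: glibc-2.3.6-i486-1\n' > "$db/glibc-2.3.6-i486-1"
printf 'PACKAGE NAME: gtk+2-2.8.9-i486-1\n' > "$db/gtk+2-2.8.9-i486-1"

# "Is glibc installed?" is just a filename test -- no special tooling needed:
if ls "$db" | grep -q '^glibc-'; then
    status="glibc: installed"
else
    status="glibc: not installed"
fi
echo "$status"
rm -rf "$db"
```

Any tool in any language can answer queries against such a db with an `ls` and a `grep`, which is exactly what makes it pleasant to build on.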

Just my thoughts.. Feel free to disagree, though with explanations of course.. (;

Last edited by Artanicus; 12-27-2005 at 07:28 PM.
 
Old 12-27-2005, 08:26 PM   #3
reddazz
LQ Guru
 
Registered: Nov 2003
Location: N. E. England
Distribution: Fedora, CentOS, Debian
Posts: 16,298

Rep: Reputation: 77
There is already a tool called smart that supports multiple package formats. Mandriva, SUSE and a few other distros already ship it, but it's still under heavy development, so it may not work right.
 
Old 12-28-2005, 06:31 AM   #4
caminoix
Member
 
Registered: Mar 2005
Location: cracow, poland
Distribution: Linnex, Gentoo, KateOS, PLD
Posts: 53

Original Poster
Rep: Reputation: 15
Artanicus:
thanks for the answer
but it seems that i still have a few doubts:

am i right in understanding that the whole problem results from the fact that software is distributed in a "pure" version, i mean without all the required dependencies?

what does it look like on windows and especially on mac? does win and mac software come with all the libraries, so that you end up having the same library installed a couple of times?

why do package management systems rely on their own databases instead of checking what can actually be found on the disk? or at least rebuild their databases based on a disk search?

why can't iso, img or the like be used the way dmg files work on macs?


reddazz:
i fear that history is in most cases very severe to those trying to unify anything. i mean, when you have two options and someone tries to do something in the middle so that the two can unite, he usually ends up promoting a third way that neither of the two can agree with.
i reckon Artanicus is very right in saying that people need to be fixed in the first place...
but thanks for the tip, anyway
 
Old 12-28-2005, 07:30 AM   #5
Artanicus
Member
 
Registered: Jan 2005
Location: Finland
Distribution: Ubuntu, Debian, Gentoo, Slackware
Posts: 827

Rep: Reputation: 31
Well, the distribution problem is another one.. Maybe an even tougher one.. Windows and Mac software usually has most of its libraries built in with the binaries or delivered alongside the software. So, it's a whole different hell when program A needs version n of some lib and program B is incompatible with version n and needs n+1 to work.. The conflict isn't solved; whichever is installed later will overwrite the dll. So, that's not a problem-free idea either.

The mac way is, I think, the smartest. Usually the dmg has all it needs right inside it. Everything stays in a neat package. But it does consume extra disk space to have multiple copies of some lib that is widely needed.. I'm not that familiar with their ways, so I can't say how big the problem really is..

The GNU folks must have loathed the idea of everyone delivering their own stuff.. Mainly because back then, I guess, compiling was the most common way of getting things done, and I'm sure no one wants to spend an extra hour compiling the dozen libs that came with a package.. So, imho it's a lot smarter to have the libs as separate packages.


There have been attempts at better package management systems. Klik uses images in much the same way dmgs are used; you've probably seen their "klik to use, no install required" shoutings.. (; It's a good idea, but again very poorly implemented. The idea isn't that complicated, yet they managed to make it voodoo. It's made so that if something goes wrong, it's impossible to fix.. "We got an error", Klik will tell you.. d:

There was also another attempt, though I've forgotten its name, to index what is already installed by simply going through the filesystem with find. I haven't tried it myself, but from what I've heard, that was the only good idea in the implementation, and even that was done poorly.
The disk search is a good idea, but not good enough.. It takes too much time and power.. Maybe if they used the slocate db it might be worth it.. No use going over the filesystem twice a night instead of just once.

IMHO we don't need a single additional package management solution. The three big ones are pretty good already: tgz, deb and rpm.. In that order.. So, if we could build an app that would integrate those three and keep them synchronized, we'd have a winner. What I'm talking about is a separate appdb that would keep the real package database as xml files and, upon changes, feed the info to /var/log/packages/packagename-version, to apt's package database wherever it lives, and likewise to rpm's database. The last problem would be source installs, but imho checkinstall removes that trouble. So, if we now compile a lib from source, checkinstall it into a package of our choosing and install it, say, as a tgz, then find a few games, one as a generic rpm and one as a deb, they could all find each other in the packagedb and live happily ever after. The generic rpm is an important note though. We have two choices:
- either the install locations are standardized,
- or they are made relative and the power is given to ldconfig and its ilk,
so that the libraries, binaries and includes end up in places where the various PATH elements find them without tricks..
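A toy sketch of that fan-out idea. The XML record layout, the appdb directory and the package name (libfoo) are all invented for illustration, and real apt/rpm databases are not plain files like this; the sketch only shows one unified record being mirrored into a Slackware-style /var/log/packages entry, inside a temp dir:

```shell
#!/bin/sh
root=$(mktemp -d)
mkdir -p "$root/appdb" "$root/var/log/packages"

# One unified record in the hypothetical XML appdb
cat > "$root/appdb/libfoo.xml" <<'EOF'
<package><name>libfoo</name><version>1.0</version></package>
EOF

# Fan it out to a Slackware-style entry: a file named name-version
name=$(sed -n 's/.*<name>\(.*\)<\/name>.*/\1/p' "$root/appdb/libfoo.xml")
ver=$(sed -n 's/.*<version>\(.*\)<\/version>.*/\1/p' "$root/appdb/libfoo.xml")
touch "$root/var/log/packages/$name-$ver"

entry=$(ls "$root/var/log/packages")
echo "$entry"
rm -rf "$root"
```

The same loop would have to write the corresponding records into apt's and rpm's databases, which is the genuinely hard part this sketch glosses over.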

An interesting subject indeed, probably much discussed already, but since nothing good is coming out of it, I feel no remorse at reopening such a discussion.. Surely my ideas have been voiced before, but I've come up with these on my own, without any prior input.. None that I know of.. d:
 
Old 12-28-2005, 04:54 PM   #6
caminoix
Member
 
Registered: Mar 2005
Location: cracow, poland
Distribution: Linnex, Gentoo, KateOS, PLD
Posts: 53

Original Poster
Rep: Reputation: 15
i've just found this thing called autopackage. it seemed very interesting at first glance, but later i found out it doesn't really work the way it should. anyway, their faq section gives lots of very interesting information (see especially the mac part).
btw, it's a funny thing: i've always sort of moaned about how much work it was to install anything on linux (i moved from windows not so long ago), but now that i've found an autopackage that doesn't work, i got pretty angry at how stupid this system is, because it simply says it can't do something and there's nothing i can do about it.

klik just doesn't work. you're very right about that.

paths to all libraries are stored in some environment variables, aren't they? now if you just added a 3rd step to the standard "./configure && make && make install": "update the variable", wouldn't that do the trick? then you could have software distributed in a dmg-like format or simply as tgz's, but with a built version included. i believe tgz's are even better because, being folders, they could have their own icons and index.html files - so you'd have the ease of use seen on macs on one hand, and the possibility to change everything, recompile the source and so on, on the other.
or am i just very much mistaken?
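a minimal sketch of what that "update the variable" third step might look like, assuming the library was installed under a hypothetical prefix /usr/local/myapp (the prefix is just an example):

```shell
#!/bin/sh
# Hypothetical prefix the library was just installed into
PREFIX=/usr/local/myapp

# Per-session fix: prepend the new lib dir to the dynamic loader's search path
LD_LIBRARY_PATH="$PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"

# System-wide alternative (needs root): register the dir with ldconfig instead,
# e.g.  echo "$PREFIX/lib" >> /etc/ld.so.conf && ldconfig
```

note that LD_LIBRARY_PATH only affects the current session; for it to stick, the install step would have to append it to a profile script or use the ldconfig route.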

it wouldn't be yet another solution - which you say you don't want (though i still believe an attempt at unification can only lead to even greater dispersion) - it would simply be a slightly modified tgz (see for example the tgzex format as used by kate os), but with much greater ease of use than slack's.

please, tell me what you think of it.

ps. i don't really believe i'm the first one to come up with these ideas, either
 
Old 12-30-2005, 09:53 AM   #7
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,659
Blog Entries: 4

Rep: Reputation: 3941
Package managers are designed to provide, not only easy installation or de-installation of software, but also the management of dependencies. There is a lot more complexity in the Linux environment than what you may encounter in either Windows or OS/X ... more possibilities that the package designer must consider ... but you can still "keep it simple."

The first thing is: choose one package-manager system, and stick with it. Only in this way can the dependency-information be kept straight.

Second, if you do install your own stuff, put it in /usr/local.... You can even put libraries in that subdirectory, which can be made to supersede whatever is located elsewhere. (That can, of course, be a two-edged sword.) Don't interfere with what the package manager is trying to keep straight for you.

One reason why Windows and/or OS/X may seem to be a nirvana is simply that those users rarely "step outside the box." When they do, I assure you from experience, things can get very hairy. There's a lot of commercial software out there whose installation procedures are far too invasive, and which was clearly "barely tested" before the ship date. (I suspect that many companies mark "code complete" with wild celebration, then do the installer as a last-minute afterthought. In this respect, Linux package designers usually do a much better job.)
 
Old 01-06-2006, 01:31 PM   #8
caminoix
Member
 
Registered: Mar 2005
Location: cracow, poland
Distribution: Linnex, Gentoo, KateOS, PLD
Posts: 53

Original Poster
Rep: Reputation: 15
you've partly hit on my problem. i do want to stick with one package manager. the thing is that hardly any gives me everything i need, so i end up having to use other means of installing things. i'm now checking out various package managers.
so far, sorcery (Source Mage) and perhaps portage seem the best to me, although compiling kde takes quite a while...

oh, you don't need to persuade me that windows installers suck. as to mac os, i have to admit i've had very little experience with it (just pearpc for some time).

anyway, my point is that i don't like it when programs are scattered all over the disk. you know, some binaries in /bin, some in /sbin, some in /usr/bin and /usr/sbin, the same again with /usr/local, plus /opt; sources elsewhere; libraries in /lib32, /lib64, /usr/local/lib and lots of other places, too; config files in /etc, ~/.kde/share/apps/..., /usr/local/kde/share and god knows where else.
so, what i would love to see is a new way of arranging this, perhaps in a more mac os-like fashion - so that most files are simply in one directory, while just the shared libraries are put into some separate folder.
well, putting all system directories into something like /sys would imho be much more elegant, too.

to make it short: i don't want any windows-like installers. i'd like to have a portage-like package manager with mac os-like grouping and generally a more windows-like layout of the root directory (/home, /lib, /prog and /sys, perhaps).
is linux configurable like that without the skills of a kernel programmer?

Last edited by caminoix; 01-14-2006 at 11:34 AM.
 
Old 01-06-2006, 08:08 PM   #9
reddazz
LQ Guru
 
Registered: Nov 2003
Location: N. E. England
Distribution: Fedora, CentOS, Debian
Posts: 16,298

Rep: Reputation: 77
There is logic to where programs are placed on Linux and Unix systems, so I am afraid installing all programs into one directory just won't happen. The issue of package managers will never be resolved, because open source serves a lot of people with different preferences. If the distros standardise on one package management system, someone else will create a distro with something different. That's both the beauty and the evil of open source.
 
Old 01-07-2006, 03:20 AM   #10
Artanicus
Member
 
Registered: Jan 2005
Location: Finland
Distribution: Ubuntu, Debian, Gentoo, Slackware
Posts: 827

Rep: Reputation: 31
reddazz, that is very true, but can you deny that a unifier program synchronizing the package managers would be possible? Or useful? Then the people wanting to go their own way could do so in peace, and those who want a well integrated package structure could keep all their installed software in nice order..

I'd code it myself, but I severely lack the skills.. Maybe someday. First figure out the kinks of the three biggest PMs' databases, then build a framework, and, if someone wants to, a fancy GUI, though I don't see the need for one more, especially in an app whose point is to stay out of sight and make things just work like they're supposed to.. (:

Any volunteers? (;
 
Old 01-07-2006, 03:20 AM   #11
caminoix
Member
 
Registered: Mar 2005
Location: cracow, poland
Distribution: Linnex, Gentoo, KateOS, PLD
Posts: 53

Original Poster
Rep: Reputation: 15
reddazz:
i'd say there are at least three separate logics (/usr/local, kde and mozilla), and none of them is really comfortable. so far, i still have my hopes...

as to absolutely standardized package management, i suppose you're very right. but i'm not a great enemy of creating yet another way to get the same thing done, especially if it can get done better that way.

Artanicus:
i'm by far not a real programmer. however, am i right in thinking that all it'd need to do is update the database of, say, apt-get with the data from, say, apt-rpm? methinks it shouldn't be very hard...

Last edited by caminoix; 01-07-2006 at 03:23 AM.
 
Old 01-07-2006, 03:38 AM   #12
Artanicus
Member
 
Registered: Jan 2005
Location: Finland
Distribution: Ubuntu, Debian, Gentoo, Slackware
Posts: 827

Rep: Reputation: 31
As to the locations of stuff, IMHO they're rather logical.. Not to say there aren't any mammoths left over from the good old times, as there are several, but it's fairly logical after all.

/bin - the most essential binaries, which the system needs to boot up and the users need too

/sbin - the same, except for the superuser (hence the s)

/usr/bin - the bulk of the binaries, all of which are useful and accessible to users

/usr/sbin - the same, again for the superuser

/usr/local/bin - the place where your own compiled versions should go. Thus, by manipulating the order and contents of PATH, one can choose whether to use the "official" versions directly under /usr or the customized versions. Or that's how I've used the separation anyway.. (;
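That PATH trick can be demonstrated in a self-contained way; the directory tree and the "tool" binary below are faked in a temp dir purely for illustration:

```shell
#!/bin/sh
root=$(mktemp -d)
mkdir -p "$root/usr/bin" "$root/usr/local/bin"

# An "official" and a locally compiled version of the same tool
printf '#!/bin/sh\necho official\n' > "$root/usr/bin/tool"
printf '#!/bin/sh\necho custom\n'   > "$root/usr/local/bin/tool"
chmod +x "$root/usr/bin/tool" "$root/usr/local/bin/tool"

# With the local dir first in PATH, the customized version wins the lookup
picked=$(PATH="$root/usr/local/bin:$root/usr/bin"; tool)
echo "$picked"
rm -rf "$root"
```

Swap the two directories in the PATH assignment and the "official" copy wins instead; that ordering is the whole mechanism behind the /usr vs /usr/local split.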

And the same applies to libs and includes. Where it gets strange is with some of the people who think their software should be "foldered".. Then we start seeing stuff like /usr/local/somesoftware/bin/somesoftware, and that's just plain wrong... Of course it can be temp-fixed by linking the binary into a standard location.. And a totally different matter is this new /opt craze.. Sure, I see the point of putting optional stuff there, things outside the core of the system; some even recommend prefixing all your own compiles there, which would have been smart if I had known it from the beginning.. (; But essentially it's just fragmentation of a very good and logical system layout..

/var and such have their own place, and I always create my own /share, where I keep documents and the like that need to be accessed by the users, the kinds of things everyone needs, like music and such.. (:

Of course this doesn't mean there couldn't be a /dis (for disambiguation) that would keep a cleaner structure of things for those who want to see it differently.. Perhaps like this:

Code:
# create the alternate, flattened view
mkdir -p /dis/sys/include /dis/sys/lib /dis/prog

# symlink the existing binaries, libraries and headers into it
ln -s /bin/* /usr/bin/* /usr/local/bin/* /dis/prog/
ln -s /lib/* /usr/lib/* /usr/local/lib/* /dis/sys/lib/
ln -s /usr/include/* /usr/local/include/* /dis/sys/include/
ln -s /home /dis/
Or something like that.. Feel free to test; it's pretty risk-free, as it's all just symlinks.. (:
 
Old 01-07-2006, 03:41 AM   #13
Artanicus
Member
 
Registered: Jan 2005
Location: Finland
Distribution: Ubuntu, Debian, Gentoo, Slackware
Posts: 827

Rep: Reputation: 31
Quote:
Originally Posted by caminoix
reddazz:
Artanicus:
i'm by far not a real programmer. however, am i right in thinking that all it'd need to do would be to update the database of say, apt-get with the data from say, apt-rpm? methinks, it shouldn't be very hard...
That would be the bulk of it, yes. But it also needs to poll the databases somehow, so that you won't have to wait for a nightly update or have to update manually when you're at an installfest.. (;

So, perhaps adding wrappers around the PM binaries to run the update after a successful install.. A temp fix, but better than nothing..
 
Old 01-07-2006, 08:37 AM   #14
caminoix
Member
 
Registered: Mar 2005
Location: cracow, poland
Distribution: Linnex, Gentoo, KateOS, PLD
Posts: 53

Original Poster
Rep: Reputation: 15
perhaps i'm simplifying things too much, but do you think the following could work:
let's say you call the unified manager uniapt. then typing, say,
uniapt -i a_new_great_piece_of_software.deb
would run apt-get install a_new_great_piece_of_software.deb && uniapt --update-databases. typing
uniapt -i sth.rpm
would get you rpm -i sth.rpm && uniapt --update-databases, and finally typing
uniapt -i sth.tar.gz
would get you
tar -zxvf sth.tar.gz && cd sth && ./configure && make && make install && uniapt --update-databases.
 
Old 01-08-2006, 06:56 AM   #15
Artanicus
Member
 
Registered: Jan 2005
Location: Finland
Distribution: Ubuntu, Debian, Gentoo, Slackware
Posts: 827

Rep: Reputation: 31
That is one way of doing things, yes.. And it's so simple to make that it would make a perfect addon. BUT the first priority should be seamless unification.. So that when those fancy tools that come with most distros install something, the whole system knows about it.

So, id rather do this:
Code:
# assuming the real rpm binary has been renamed to rpm-real:
rpm-real "$@" && uniapt --update-db
And just place that, as a small wrapper script, where the rpm binary used to be..
 
  

