LinuxQuestions.org


Woodsman 12-09-2008 01:00 AM

Updating Non-Stock Packages After a Slackware Release
 
Are there any guidelines or rules of thumb to help determine which non-stock packages should be recompiled after updating to a new Slackware release?

Some are obvious, such as virtual machine software or video driver kernel modules, because those simply break otherwise. But beyond the obvious cases?

Thanks again.

zux 12-09-2008 02:23 AM

It depends on the software. Most of it will keep working, but it's best to keep the SlackBuilds with the source and recompile when you need to (I keep SlackBuilds for all the packages I build and install). The only reliable way to know, though, is to read the Slackware ChangeLog and understand exactly which parts of the system each package uses, and how.

Alien Bob 12-09-2008 02:46 AM

The packages you build using SlackBuild scripts from slackbuilds.org, from Robby Workman, or written by me all use "tags" that let you identify them easily. The same is true for the packages you can download from slacky.eu and other repositories.
For instance, my packages all have the tag "alien". You can easily get a listing of my packages you have installed by running
Code:

ls /var/log/packages | grep "alien"
On my old box here, that gives the following output:
Code:

clamav-0.94.2-i486-1alien
dansguardian-2.9.9.4-i486-1alien
par-1.52-i486-1alien
pcre-7.6-i486-1alien
terminus-font-4.26-noarch-1alien
tinyproxy-1.6.3-i486-1alien
xcowsay-1.1-i486-1alien
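As an aside, if you want to script against such a listing, a Slackware package file name splits into name, version, arch, and build+tag on its last three hyphens. A minimal sketch, using one of the names above:

```shell
# Split a Slackware package file name (name-version-arch-build+tag)
# on its last three hyphens. rev/cut counts fields from the right,
# because the package name itself may contain hyphens.
pkg="clamav-0.94.2-i486-1alien"
name=$(echo "$pkg" | rev | cut -d- -f4- | rev)
version=$(echo "$pkg" | rev | cut -d- -f3 | rev)
arch=$(echo "$pkg" | rev | cut -d- -f2 | rev)
build=$(echo "$pkg" | rev | cut -d- -f1 | rev)
echo "$name $version $arch $build"   # -> clamav 0.94.2 i486 1alien
```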

It becomes harder, of course, when you have installed software using configure; make; make install.

Eric
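One way to cope with hand-installed software is to look for files that no package file list claims. The sketch below assumes the Slackware convention that file lists under /var/log/packages record paths without a leading slash; it is demonstrated against a mock file list rather than a live system, so the package name and paths here are made up:

```shell
# Check whether an absolute path appears in any package's file list.
# Slackware file lists store paths without the leading slash
# (e.g. usr/local/bin/foo), hence the ${1#/} below.
# Mock package database (made-up entries) for demonstration:
PKGDB=$(mktemp -d)
printf 'usr/local/bin/tracked\n' > "$PKGDB/mypkg-1.0-i486-1"

is_tracked() {
  if grep -qx "${1#/}" "$PKGDB"/*; then echo tracked; else echo untracked; fi
}

is_tracked /usr/local/bin/tracked    # -> tracked
is_tracked /usr/local/bin/handmade   # -> untracked
```

On a real system you would point PKGDB at /var/log/packages and run the check over, say, everything under /usr/local.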

GazL 12-09-2008 05:55 AM

I fall into the 'recompile it all' camp, myself. The entire POSIX/UNIX philosophy seems to revolve around compatibility of source. Old binaries are pretty much an unknown quantity: they may or may not work, or worse yet, they may work but with obscure bugs introduced by some minor change in a library or header. It just seems much safer to my mind to recompile the lot. Of course, recompiling with the new tool chain could also introduce obscure bugs, but to my mind keeping everything consistent is worth the risk.
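On that point, one quick way to flag an old binary that has already hit a library incompatibility is to look for "not found" lines in its ldd output. A hedged sketch; the directory scanned here is just an example of where hand-built software often lands:

```shell
# Report binaries whose dynamic dependencies no longer resolve.
# "not found" in ldd output means a shared library the binary was
# linked against is missing after the upgrade.
needs_rebuild() {
  ldd "$1" 2>/dev/null | grep -q "not found"
}

# Example scan over hand-installed binaries:
for bin in /usr/local/bin/*; do
  [ -x "$bin" ] || continue
  if needs_rebuild "$bin"; then
    echo "probably needs a rebuild: $bin"
  fi
done
```

This only catches hard breakage (a missing or renamed library), not the subtler bugs from changed headers, so it complements rather than replaces a recompile.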

I was recently working on a patch for the initrd-tree that I've sent to Pat. To incorporate this properly into my system I need to re-run mkinitrd.SlackBuild, but I've found that the version of busybox that -current contains won't build with the 2.6.27 kernel headers (a change to netfilter.h now appears to require programs to also #include <netinet/in.h>). Using the latest busybox sources instead resolved the problem for me. Now, busybox is statically linked, so running the old version of the package shouldn't hit any library incompatibilities, but if any of the system calls have changed then there could still be problems.

Owing to the way the Slackware team works, some old packages are carried through to each new release from the previous one. Pat is usually very good at keeping things working, so I trust his judgement on what needs recompiling for a new release and what doesn't; however, it's not unusual to find the odd part of the source tree that fails to build after a new release. Though all sources are available, you can't actually recreate a specific Slackware release from that release's source tree alone, as there may be parts that require a much older release in order to build correctly. I've always found this disconnect between the packages tree and the source tree a little disconcerting.

Personally, I'd much prefer all the packages for each release to be built fresh. When I first discovered that Pat doesn't actually do this I was quite surprised. I'd always had it in my mind that he'd have one big Slackware.Slackbuild script sitting above all the other slackbuilds, that he'd run to build a full set of new release packages from source.

Anyway, no criticism intended, just expressing my viewpoint.

rkelsen 12-09-2008 06:12 AM

Quote:

Originally Posted by GazL (Post 3368971)
Old binaries are pretty much an unknown quantity and they may work and they may not, or worse yet, they may work but with obscure bugs introduced due to some minor change in a library or header.

In my experience, it's generally only the "system-level" software which may cause problems.

I'm still running some software which was compiled on Slackware 10.0. It works perfectly. There are packages in Slackware which are even older than that.

jong357 12-09-2008 06:39 AM

Quote:

Originally Posted by GazL (Post 3368971)
I fall into the 'recompile it all' camp, myself. The entire Posix/UNIX philosophy seems to revolve around compatibility of source. <...>

Personally, I'd much prefer all the packages for each release to be built fresh.

Anyway, no criticism intended, just expressing my viewpoint.

I feel the same way. It's more work up front, but you don't have to monitor the vitals of numerous packages to make sure everything still jibes. I've never been a big fan of rolling releases, just because you really have to be on your game to make sure everything remains stable.

My hat's off to Pat for releasing the way he does. Too much work for me tho.... ;)

Woodsman 12-09-2008 06:04 PM

I know Pat does not recompile every package with each release. From experience I know that system-level packages directly tied to the kernel or tool chain have to be recompiled.

I had guessed that recompiling all non-stock packages would be safe, but I also guessed that some system-level packages might not recompile with the new kernel headers. I have bumped my head a few times with the transition to 2.6.27.7 while testing -current. In that case, the prudent approach is not to update to the newest Slackware release immediately: wait for things to cool down, by which point most of the sources for non-stock packages should have been updated to support the new kernel and tool chain.

I suppose the safe route is: when unsure, recompile. Yet I don't want to recompile 130+ packages, so I'll take the lazy route and watch for breakage before recompiling.
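Even on the lazy route, keeping a list of what would eventually need rebuilding is cheap, since the tags discussed earlier make non-stock packages easy to enumerate. A sketch against a mock package database (the "custom" tag and package names are made up; on a real system you would glob /var/log/packages directly):

```shell
# Enumerate installed packages carrying a given tag, as rebuild
# candidates. Mock stand-in for /var/log/packages:
PKGDB=$(mktemp -d)
touch "$PKGDB/foo-1.0-i486-1custom" \
      "$PKGDB/bar-2.3-i486-2custom" \
      "$PKGDB/bash-3.1-i486-1"      # stock package, no tag

for p in "$PKGDB"/*custom; do
  echo "rebuild candidate: $(basename "$p")"
done
```

Feeding each name back to its saved SlackBuild then becomes a mechanical loop rather than a memory exercise.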

