Linux - Software
This forum is for Software issues. Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
I wanted to know if there is any issue in what I am doing.
I compiled the latest version of Python, 2.7.3, on CentOS 5.9 and it's running properly.
I compiled it into:
/usr/local/python
If I copy this folder to an Ubuntu server, it also works without any errors.
I have been trying to figure out whether there is an issue with this.
Could it be unstable?
No, there SHOULD not be any. Linux, unlike the "other one", keeps programs well isolated. Once you have compiled the Python code, copy the output (the folder) onto the other system, then try to run it from the console to see what happens:
Quote:
/folder/to/python --version
If you get a version number, it should be okay.
If you want to avoid having to type the whole path, there are two things you could do:
- put it in the PATH, or
- put it anywhere and make a symlink to it in the /usr/bin folder
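To make both options concrete, here is a minimal sketch; /usr/local/python is the path from earlier in the thread, so adjust it to wherever you copied the build:

```shell
# Option 1: add the folder to PATH for the current shell
# (put this line in ~/.bashrc or ~/.profile to make it permanent)
export PATH="/usr/local/python/bin:$PATH"

# Option 2: symlink the binary into a directory that is already on the PATH
# (commented out here because it needs root and a real install at that path)
# sudo ln -s /usr/local/python/bin/python /usr/bin/python2.7
```

After either option, plain `python --version` (or `python2.7 --version` for the symlink name above) should work from any directory.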
Yes, there may be issues. It is very difficult to assess whether there will be or not. The compiler toolchain is generally built with knowledge of the standard C library (it uses headers from the standard C library to build the toolchain). As such, there may be differences in calls to the standard C library. Cross compiling may fail if there are shared object libraries on the build host that are missing or different on the target host. You can use ldd on the runtime object code to see what shared object libraries it expects to find, and use that information to assess that level of compatibility. Obviously, the general target architecture must be compatible, in terms of binary object code compatibility.
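For reference, the ldd check mentioned above looks like this on the target host; /bin/sh stands in for the copied interpreter here, so substitute your actual binary (e.g. /usr/local/python/bin/python):

```shell
# List the shared objects the binary expects, as resolved on THIS host.
BIN=/bin/sh                    # substitute the copied interpreter here
ldd "$BIN"
# A healthy run resolves every dependency to a file path (plus a vDSO
# entry); any line ending in "not found" is a library this host lacks.
```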
Personally, I'm almost never comfortable depending on binary compatibility for any code that I really care about (production-class code). Having said that, I can't say I've had significant problems because of it.
Since you are talking about Python, I just want to be clear about the compatibility levels you're talking about. It sounds like you've cross-built the Python interpreter, and are using it on the target host. There is potential for some things to not work in that scenario. If you are talking about porting Python source files between hosts that have a correctly installed interpreter, then you should be perfectly fine as long as the Python source code doesn't contain any version-specific code.
...hmm, sure...but it will (should) not destabilise the system...
Quote:
Personally, I'm almost never comfortable depending on binary compatibility for any code that I really care about (production-class code)
Personally, I'm always scared to do what the OP wants, but...if it doesn't work, and none of the target's structure has been changed (dependencies and such)...it should be a scenario of "does or does NOT work"...
It's a bit like Java installs really: unzip somewhere and call the Java directly...not really optimised for any target, but it could work...
Of course, OP did not mention just WHY he wants to do just that...
>> It's a bit like Java installs really: unzip somewhere and call the Java directly...not really optimised for any target, but it could work...
It does work.
>> Personally, I'm always scared to do what OP wants
Well I want to do this to save time.
I have to run the software I am compiling (PHP, Python, Apache and the like) across Enterprise Linux 5 and 6 (Red Hat-style distros) and Ubuntu LTS.
I will also have to compile for both 32- and 64-bit architectures. So it's always good to save time, if possible, by not compiling this software everywhere, and instead compiling it on only one server, like CentOS 5, and distributing it everywhere.
But stability is important and hence I seek your advice.
Will compiling on CentOS 5 and running on other distros be stable?
If the code is built against identical, or at least similar, versions of all libraries, it is quite probable that it will work. Check especially the standard C library, glibc. The problem with the distros you are using is that they are at almost opposite ends of the spectrum in terms of versioning: Red Hat-based distros are very conservative and favor older versions of most things, while Ubuntu tends more toward the new and modern.

Whatever potential problems exist may be extremely subtle, and found only after significant testing. A quick test just to see if the code runs is unlikely to reveal any problem.

There are a fair number of applications out there that are distributed as binary-only 'one-size-fits-all' object code, which does seem to work okay, so that seems to bode well. On the other hand, there are also many packages distributed as binary-only code but with a number of distro-specific versions, so I conclude that those vendors have identified differences that matter.
Earlier posts in this thread seem to have suggested that running code built on another architecture may be harmful to the target host in some way. I find this improbable, unless the code is running in kernel space.
Thank you @theNbomr, that seems like really sound logic.
On the whole, I am now building the dependencies as well and then distributing them.
So only glibc and certain other system libraries remain as external dependencies.
From what I understand, this can be an issue if a function call, e.g. in glibc, was changed or removed in the later glibc versions on Ubuntu or other distros; that could cause crashes.
So there is a small risk, but I will test it.
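One way to test that risk up front, rather than waiting for a crash, is to list the versioned glibc symbols the binary actually references and compare them with the glibc on the target. A sketch using objdump, with /bin/sh standing in for the compiled interpreter:

```shell
# Every versioned glibc symbol the binary references, e.g. GLIBC_2.3.4.
# A binary built on CentOS 5 (old glibc) should only reference symbol
# versions that the target's newer glibc still provides, which it
# normally does, since glibc keeps old symbol versions around.
BIN=/bin/sh                    # substitute the compiled binary here
objdump -T "$BIN" | grep -o 'GLIBC_[0-9.]*' | sort -u

# The target host's own glibc version, for comparison:
ldd --version | head -n 1
```

If the highest GLIBC_x.y version listed is at or below the target's glibc version, the glibc side of the move should be safe; moving in the other direction (new build host, old target) is where "version GLIBC_x.y not found" errors appear.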
If anyone knows anything more, please do share, as I would like to hear from everyone experienced in this.