These have mostly been answered before, but I guess I can give it a shot.
Quote:
Originally Posted by Jorophose
First, do you think it's worth using AMD64 at the moment? (Include EM64T or whatever Intel has if they're close to each other) Are most drivers working under AMD64? What about Flash & Java & other things like that? Can most of my applications work under 64bit? (For proprietary applications, can they be run in 32-bit emulation, without huge performance hit?)
Personally, I think it's worth using x86-64 at the moment. The vast majority of open source drivers work under x86-64, and even most proprietary drivers (for Linux and the open source OSes) work. Adobe Flash is a proprietary application, which brings me to the next point*. "Proprietary applications" run just fine in 32-bit mode. Notice that x86-64 processors have both a 32-bit and a 64-bit mode, so if you run, for example, 32-bit Windows on an x86-64 machine (as many new computer owners do), you are running the processor in 32-bit mode. There is no emulation in the true sense: the processor itself runs 32-bit software with 32-bit registers and 32-bit instructions.
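As a quick sanity check (the commands below assume a GNU/Linux system with the standard file(1) utility; exact output varies by distribution):

Code:
$ uname -m     # prints x86_64 on a 64-bit kernel, i686 or similar on a 32-bit one
$ file /bin/ls # reports whether a binary is a 32-bit or 64-bit ELF executable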
Quote:
Originally Posted by Jorophose
Second, how do I compile software (& my OS), and use .tar.gz/.bz archives to install software? I've heard that compiling software also takes up more disk space than precompiled, is that true?
The vast majority of open source software is distributed in "tarballs" and uses the autotools system for portability. With these packages, all you need to do is untar the source, run the "configure" script that comes with the package (i.e., "./configure"), run make to build (i.e., "make"), and run make again to install (i.e., "make install"). Of course there are ways to customize the build process (such as compiling with debug symbols or with particular CFLAGS), which mainly involve changing the arguments and environment used when running the configure script. Note that not all source code uses autotools; lately more and more projects are using other systems such as SCons, CMake, ant, etc. As for disk space, you take up only as much space as you want (for example, do you want/need to keep the tarball and/or the build tree?). Software that you compile yourself takes about the same space as precompiled software; after all, precompiled binaries were built in much the same manner as your self-compiled ones.
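To make that concrete, a typical autotools build looks something like this (the package name and install prefix are just examples; a prefix under $HOME avoids needing root for the install step):

Code:
$ tar xzf foo-1.0.tar.gz     # unpack the tarball (use xjf for .tar.bz2)
$ cd foo-1.0
$ ./configure --prefix=$HOME/local CFLAGS="-O2 -g"   # customize the build here
$ make                       # compile
$ make install               # copy the results into the prefix

Afterwards you can delete the build tree (and the tarball, if you like) to get the space back.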
Quote:
Originally Posted by Jorophose
Third, the "UNIX appocalypse": Is the only way to avoid it to go for a 64-bit OS? Or will there be ways to hack legacy Linux/UNIX/BSD 32-bit machines to continue working? I have a few 32-bit machines, and if I can reset the clock (Or extend the amount of time they're usable) that would be great. I think the end of 32-bit comes at like 2030 or something. Which sucks, but I guess the same thing happened during Y2K, and by then 64-bit machines will be like Pentium 2s/3s. (Given away for free or for like 30$)
No one can predict the future, but nothing will stop 32-bit machines from working just because they don't know the correct date. Incidentally, the date in question is 19 January 2038 (not 2030): that is when a signed 32-bit time_t overflows. IMHO, by that time almost everyone will be using 64-bit processors, or there will be some sort of kludge involving an ABI change in which time_t is widened to accommodate larger numbers on 32-bit processors.
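If you are curious about the exact rollover, it is easy to compute with GNU date: time_t counts seconds since the Unix epoch, and a signed 32-bit counter tops out at 2^31 - 1:

Code:
$ date -u -d @2147483647    # the last second a signed 32-bit time_t can represent
Tue Jan 19 03:14:07 UTC 2038

One second later, a 32-bit time_t wraps around to a negative value.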
*There is a slight problem with plugins such as Adobe Flash (i.e., dynamically loading 32-bit shared libraries into a 64-bit browser). This can be overcome with suitable wrappers, such as nspluginwrapper for Mozilla browsers.
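For the record, wrapping a 32-bit plugin with nspluginwrapper is a one-liner (the plugin path below is just an example; it varies by distribution):

Code:
$ nspluginwrapper -i /usr/lib32/mozilla/plugins/libflashplayer.so   # wrap the 32-bit plugin
$ nspluginwrapper -l                                                # list wrapped plugins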