Linux - Newbie
This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-tos, this is the place!
Most modern distributions include the ntfs-3g FUSE driver, which is used for reading and writing NTFS partitions, so that part of your request should be met by all the distributions... really it comes down to what you want. Ubuntu is a common choice because it's relatively easy, in part by being relatively restrictive; Fedora is a rival to Ubuntu in this field, and effectively the two just come from completely different branches of the Linux tree (Fedora being on the Red Hat branch and Ubuntu on the Debian branch, as mentioned above). Fedora tends to be slightly more open in my opinion but slightly less stable. Both Ubuntu and Fedora sit on the cutting edge trying to get the latest things, which can lead to stability or compatibility issues down the line. For this reason, with both Ubuntu and Fedora you will probably find yourself doing updates/upgrades more often than with other distributions... they are still pretty much the most desktop-friendly, however.
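For what it's worth, mounting an NTFS partition with ntfs-3g usually looks something like this. The device name and mount point below are assumptions; substitute the ones on your own system:

```shell
# Mount a Windows NTFS partition read/write via the ntfs-3g FUSE driver.
# /dev/sda1 and /mnt/windows are placeholders for your actual device and mount point.
sudo mkdir -p /mnt/windows
sudo mount -t ntfs-3g /dev/sda1 /mnt/windows
```

Most distros also do this automatically for partitions listed in /etc/fstab or detected by the desktop environment.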
I used Xandros (mostly Rev 3) for a long time because a few of my clients seemed to want something that looked and worked (on the surface) like Windows. For myself, I moved to Debian as a production OS on a workstation and as the host for a "virtual server" to test distributions. I keep in practice installing and updating systems using this technique. (One of the systems in the virtual machine inventory is Xandros 4.x, but I don't use it much.) I rarely have more than 6 VMs running -- at this moment it's Solaris 10, openSUSE 11.2, CentOS 5.5, Ubuntu 10.04.1, and Fedora 13, out of about 25 in my VMware inventory.
Many of the suggestions already posted should be useful to you. Along that line, a recently-written article that compares current systems may also be useful, as opposed to looking through the bare list of distributions:
LFS teaches people how a Linux system works internally
Building LFS teaches you about all that makes Linux tick, how things work together and depend on each other. And most importantly, how to customize it to your own tastes and needs.
For media editing and the like you pretty much have to go cutting edge and/or install from source, at which point distro packages become a bit too dated. You'll likely need to install from source anyway, since mp3 and other common codecs are stripped from a lot of distros, or the packaged versions are just so dated that unless you're working in SD video and stereo audio, you might find a Linux distro as shipped ill-suited (without drastic customization).
I run Debian lenny, but.... I have a four-year-old laptop and layer a bunch of from-source builds on top of it to make it work for my media editing needs.
It's still mostly command line for anything optimal: the fewest conversions between video codecs and the best resampling of audio, aka HQ as in high-quality needs. There are lots of tools not listed here, and not many distros include all of them, or recent enough versions to be of use. That's not counting system optimizations to get the most out of your machine: a custom kernel, graphics driver, version of X, and other things.
If you just want distro-packaged versions of the tools, you might find yourself running multiple distros, as one favors video, another audio, another authoring, which is annoying at best. If you customize your system a bit (building from source), the choice of distro becomes moot: as long as the basics are recent enough you can make do. Even Debian lenny is a bit too old for a lot of this stuff, but not unusable, since you can build the needed elements from source. That can get a bit dicey when working with a kernel, ALSA, jackd, and the other things the newer software needs just to compile.
How about ubuntu studio if you need an up-to-date multimedia distro? http://ubuntustudio.org/
Never tried it myself, so I don't know...
Yeah, I've been studying this. It can be installed as an OS, from within Ubuntu, or run as a live CD. Since I plan on putting everything on a 1TB HD, I think I'll have room to install Studio from within Ubuntu, although I'm not sure how that would work since it's a separate OS. Does that mean I would be triple booting between Ubuntu / Ubuntu Studio / WinXP?
What would you do for an optimal system? If you were me?
Regarding real optimization: if your CPU supports newer instruction sets like SSE2 and later, you may recompile your system to make it faster.
You can look for distros that support recompiling the system. Most distros only distribute binaries compatible with old CPU instruction sets like those in the i486 or i686, but some allow you to rebuild your system to use the newer ones. If your applications can use the newer instructions, they can probably go faster.
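Before bothering, you can check which of those instruction sets your CPU actually supports; on Linux the kernel reports them in /proc/cpuinfo (flag names are the kernel's own, and the particular flags grepped for below are just common examples):

```shell
# Print any of a few common SIMD flags the CPU advertises, or a fallback message.
grep -m1 '^flags' /proc/cpuinfo | tr ' ' '\n' \
    | grep -E -x 'sse2|sse3|sse4_1|sse4_2|avx' \
    || echo "none of the listed flags found"
```

Any flag printed here is an instruction set a recompiled system could target with the matching -march/-m flags.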
The only problem is that recompiling a system is not an easy task and takes a long time. There's also the risk that some things will not work afterwards, and the risk of hardware trouble, since the build puts a lot of stress on your hard disks.
Anyway it's still up to you to decide. If you think you can really gain benefit from recompiling the system then maybe you can go for it.
If I knew how to compile anything, I wouldn't be here asking for help - I'd be here giving help. I'll leave compiling to the engineers.
I've done a little research. I've found that OpenSUSE 11.3 may fit my needs the best because:
OSS provides automatic NTFS partition handling and a boot entry for Windoze (for trogs who desire dual boot)
OSS 11.3 has kernel 2.6.34 which optimizes use of advanced format HD
OSS has repository access to Cinelerra, Ardour, and a few other multimedia editing apps I'm using now.
I looked up the method for putting Ubuntu in dual boot w/Windows. If I had understood those instructions exactly, then I would probably know how to compile things, and I wouldn't be here asking questions. I crave ease of installation, therefore I choose OSS 11.3.
IF I knew that I could absolutely and completely replace all windoze apps w/Linux, I would do it in a heartbeat. If I knew Ubuntu could replace everything I'm doing in Windows AND optimize use of my new 1TB HD, I'd run away with her. Because of my experience w/Xandros, I am playing it "safe" with a dual boot.
"Safe" in a manner of speaking - in a way, I believe I've become somewhat black hearted in my use of windoze. The problem is Xandros didn't make things any better. When I go to WinXP, it's like the high maintenance "abusive wife" who turns blue and shuts down whenever she sees fit, but I'm cheating on her with an amish dominatrix in the form of Xandros who beats herself when I try to bring her into the present.
I'm gonna find a party somewhere with real people. Maybe, I'll find a sexy software engineer who can tell me what's best.
You realize, of course, I had to look up what you mean:
I'm gonna find a party somewhere with real people. Maybe, I'll find a sexy software engineer who can tell me what's best.
Sorry, I was just trying not to favor my own favorite distro, since there are other distros that are also good for recompilation. I thought it best to let you find the system that suits you, but perhaps I was wrong, and it would have been better to just mention some of the distros I know. Again, sorry about that.
These two distros are the only ones that I know that generally support recompilation of the system. I mean most of the packages that you install in them are recompiled before use.
The first distro I tried of the two was Gentoo. I tried ArchLinux once back in 2005 but never really used it, and chose Gentoo as my favorite... but you are always free to choose either of them. I think both distros provide enough docs and howtos to guide you through the setup of your system. They also have supportive forums.
Here are some helpful links you might consider if you go for Gentoo.
If you do consider recompiling your system, the following are some notes on installing Gentoo, just in case.
With Gentoo you'll basically start by downloading a stage3 tarball and a Portage snapshot. Extract the stage3 tarball to the root directory of a new partition, then extract the Portage snapshot inside usr/ of that same partition.
The overall steps after copying the files are to make some configurations in /etc and install a kernel for the new Gentoo system. Once that's done, you'll be able to run Gentoo on its own (after a reboot).
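Concretely, the unpack steps might look like this. The partition, mount point, and tarball names are assumptions based on typical stage3/snapshot file names, so substitute the files you actually downloaded:

```shell
# Mount the new partition and unpack the stage3 tarball into it;
# the -p flag preserves the permissions recorded in the archive.
mount /dev/sda3 /mnt/gentoo
tar xpf stage3-x86-*.tar.bz2 -C /mnt/gentoo

# Unpack the Portage snapshot so the tree ends up in /mnt/gentoo/usr/portage.
tar xf portage-latest.tar.bz2 -C /mnt/gentoo/usr
```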
First we have to access the new Gentoo system in an isolated environment. We'll do that with chroot. This involves bind-mounting the active directories so they're shared with the Gentoo system, then running the chroot command itself.
For example, if your new Gentoo partition is mounted at /mnt/gentoo, you'll have to do this:
mount --bind /dev /mnt/gentoo/dev
mount --bind /proc /mnt/gentoo/proc
chroot /mnt/gentoo /bin/bash
Now we should be in the new gentoo system.
After this, the first thing to do is choose the Portage profile. That's done by creating a symlink 'make.profile' in /etc that points to a profile directory under /usr/portage/profiles. For starters you should only consider one of these two: /usr/portage/profiles/default/linux/<system>/10.0/desktop/kde or /usr/portage/profiles/default/linux/<system>/10.0/desktop/gnome, where <system> is the type of system you have. For example, I use KDE and my system is x86, so I chose /usr/portage/profiles/default/linux/x86/10.0/desktop/kde
cd /etc; ln -sf /usr/portage/profiles/default/linux/x86/10.0/desktop/kde make.profile
After setting the profile, you'll have to edit /etc/make.conf. This file has many possible settings; you can find more detail in the Gentoo Handbook. For a common installation, or just for the sake of building the kernel, it can be done with settings like this:
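The actual settings appear to have been lost from the post; a minimal /etc/make.conf for an i686 host looks something like the following (illustrative values only, so check the Gentoo Handbook for what suits your CPU):

```shell
CFLAGS="-O2 -march=i686 -pipe"   # optimization flags used when compiling C
CXXFLAGS="${CFLAGS}"             # use the same flags for C++
CHOST="i686-pc-linux-gnu"        # build target triplet
MAKEOPTS="-j2"                   # parallel make jobs, roughly number of cores + 1
```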
Note that the settings above are based on an i686 PC host. Please refer to the Gentoo Handbook for your architecture.
Also, the only editor provided is nano, so editing the file is done with 'nano /etc/make.conf'. If you want to use another editor (for example, one installed on your host system), you can access the file directly from the host at /mnt/gentoo/etc/make.conf.
There are more configurations that should be done after this, but in my experience this is already enough for building the kernel.
To build the kernel, you should first merge the sources.
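The command itself appears to be missing from the post; on Gentoo, "merging the sources" is typically done with emerge (package name as in the Handbook of that era):

```shell
emerge gentoo-sources   # downloads and unpacks the kernel sources under /usr/src
```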
This will download the sources and place them in /usr/src.
You can also install a helper program for building the kernel if you don't know how to compile one yourself.
I also suggest reading some tutorials on how to compile the kernel. There should be a lot of them around the web.
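The helper program being referred to is presumably genkernel; a sketch of its use, in case you'd rather not configure the kernel by hand:

```shell
emerge genkernel   # install the helper
genkernel all      # build and install a kernel plus initramfs with generic defaults
```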
From this point on, honestly, I'm getting tired and finding it hard to explain, but at least we're already halfway through.
For any problems just refer to the Handbook; it's pretty much complete. Before writing this micro-guide I didn't even think of the fact that you'll also have to set up a temporary network connection for downloading the files. The topic is already quite broad for me.
After compiling the kernel you should set up your boot loader to run the newly built kernel with the proper parameters, like the root parameter pointing to the new Gentoo partition. Also edit /etc/fstab so that Gentoo will recognize the proper locations of the root filesystem and other partitions at boot.
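For illustration, /etc/fstab entries for the layout used in this example might look like this (device names and filesystem types are assumptions; match them to your actual partitions):

```shell
# <filesystem>  <mountpoint>  <type>  <options>  <dump> <pass>
/dev/sda3       /             ext3    noatime    0 1
/dev/sda2       none          swap    sw         0 0
```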
Once you've got Gentoo running on its own, you can finally recompile it with 'emerge -e system' and/or 'emerge -e world'. Remember that this kind of optimization depends on the values of CFLAGS and CXXFLAGS in /etc/make.conf. Try searching for 'gcc optimization' for more info about optimization flags.
Btw, I can't go online often and I have other things to do, so I hope you'll consider the limits of how much people like me can help.
And again, rebuilding the whole system is not an easy task and takes a long time, so think about it before you decide.
Last edited by konsolebox; 08-12-2010 at 04:44 AM.
Bear in mind that optimizing via compilation isn't going to double or triple your system's performance. It's merely a 5% to maybe 50% gain in speed, at a pretty high cost. But you gain a lot by having the latest versions of things, with abilities not present in the old versions that come with your distro. For some things where extended CPU features drastically change performance you'll notice gains, but for most things probably not. So it takes 6 seconds to boot instead of 10. If you're running a server cluster that could be a significant gain, but for most desktops, not really. You could get bigger gains with administrative tricks like putting swap and/or /var/log and /tmp on their own storage devices, and other bottleneck-management techniques.
and other sites. It used to be pretty standardized.
But there's a lot of early-development software out there now that uses OTHER tools, like scons, mercurial, and too many others to count. Plus you need the right compiler: for assembly-language code gcc comes with gas, but some sources require nasm or even yasm to build. If you don't need to become an engineer, you might find it easier to wait five years for today's code to become current in your distro of choice. Which will be five years old in five years.
Outside of compiling your own kernel, your version of X, and your media players/editors, you don't really gain much in the performance department.
What would be the benefit of compiling if the OP installed a 64-bit system? I believe 64-bit has all the instruction sets enabled by default, not to mention that 64-bit is faster in CPU-intensive tasks anyway.
There's a benefit if the newer version has been optimized for 64-bit, such as having multi-threading enabled, and the version that came with the distro doesn't include that feature. Plus some things are still distributed as source only, which requires you to compile IF you want to use them. Granted, outside of bug fixes, performance tweaks, and enhanced features, you don't really gain much.
64-bit could actually slow you down if your application doesn't actively use any of the benefits of 64-bit, like the extra registers or the handling of larger chunks of RAM. Slowed down in that the instructions are now bigger from using full 64-bit operands, instead of the smaller opcodes of their 32-bit counterparts; i.e. the program is bigger, takes longer to load, and slightly longer to execute. Not that you'll notice much with today's faster CPUs and RAM. I've had to compile a few things, less so now than then: ALSA with sequencer support, TiMidity with jackd support, a kernel with ACPI support in those early days. Things that didn't come with the distro of choice at the time, or didn't work entirely well on my hardware without the upgrade.