I hate java (the programming language, not the coffee nor the island--about which I have no knowledge, and therefore no opinion).
I'm here in part to explain why, in part to figure out why. I'm NOT here to tell you that you should hate java--if you really love it, I accept that (depending on such contingencies as the existence or lack of existence of free will, it may or may not be your choice, but in any case it's my choice or lack of choice to accept that--and in fact I would like to understand why).
So, on to the good stuff: why do I hate java?
The easy answer, which only shifts the question, is that it doesn't possess the qualities that I like about programming languages. That of course raises the question: what is it I like about programming languages?
Perhaps I can best figure this out by looking at the programming languages I like the most--at the moment those are C and python--and work out the differences between them and java.
The Committee kept as a major goal to preserve the traditional spirit of C. There are many facets of the spirit of C, but the essence is a community sentiment of the underlying principles upon which the C language is based. Some of the facets of the spirit of C can be summarized in phrases like
1. Trust the programmer.
2. Don't prevent the programmer from doing what needs to be done.
3. Keep the language small and simple.
4. Provide only one way to do an operation.
5. Make it fast, even if it is not guaranteed to be portable.
The last proverb needs a little explanation. The potential for efficient code generation is one of the most important strengths of C. To help ensure that no code explosion occurs for what appears to be a very simple operation, many operations are defined to be how the target machine's hardware does it rather than by a general abstract rule.
Yes. Yes, yes, yes!
Let's see--does java succeed at this? I don't think so:
Trust the programmer: /* fallthrough */
Don't prevent the programmer from doing what needs to be done:
Well, for one, it does not trust me to figure out what the best program design is, it tells me (and of course, object orientation is better than any other paradigm for any task). Also, it trusts itself to not abuse operator overloading (a plus sign is also string concatenation), but doesn't trust me to not overuse it. Apparently operator overloading decreases the readability of programs, because of the uncertainty of what the operators do (which is why you have to write a.times(x).times(x).plus(b.times(x)).plus(c), which is infinitely more readable than a*x*x + b*x + c). Then again, the language doesn't trust the programmer to not underuse operator overloading, which is why + works for strings.
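To make the comparison concrete, here's what polynomial evaluation looks like when arithmetic has to go through method calls. A sketch using BigDecimal (the standard-library case where this actually bites; the class and method names here are made up for illustration):

```java
import java.math.BigDecimal;

public class Poly {
    // a*x*x + b*x + c, spelled out as method calls:
    static BigDecimal eval(BigDecimal a, BigDecimal b, BigDecimal c, BigDecimal x) {
        return a.multiply(x).multiply(x).add(b.multiply(x)).add(c);
    }

    public static void main(String[] args) {
        // 1*2*2 + 2*2 + 3 = 11
        System.out.println(eval(new BigDecimal(1), new BigDecimal(2),
                                new BigDecimal(3), new BigDecimal(2)));
    }
}
```

Compare the body of eval with `a*x*x + b*x + c` and decide for yourself which one you'd rather debug at 3 AM.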
Another thing--java compilers warn you about dead code and methods without a return. True, C compilers do the same thing (and it's actually very valuable), but there's a difference: java compilers treat them as errors (-Werror), which means you have to comment out the code if (say, as part of debugging) you return early from a function.
But it gets worse than this: between upgrades, some compilers learn new control flow analysis tricks, so previously valid code becomes invalid simply because the compiler can prove (in the new version) that you do or don't return properly.
And just to add insult to injury, it doesn't know sh<beep> about functions that don't return:
Any "good enough" java compiler will say "ah-ah, I'm not compiling that, you need to have a return statement in that method". Pardon? I don't need a god damn return statement, I never return--System.exit never returns.
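A minimal sketch of the situation (the method name is made up): javac's flow analysis treats System.exit like any other void call, so a non-void method still needs a return or throw after it:

```java
public class Exiter {
    // System.exit never returns, but javac's flow analysis doesn't know
    // that: delete the final `throw` (or a `return`) and this method no
    // longer compiles ("missing return statement").
    static int usageAndExit() {
        System.err.println("usage: prog FILE");
        System.exit(2);
        throw new AssertionError("unreachable, but required anyway");
    }
}
```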
Also, of course java would never support multiple inheritance, because that can "only" be used to shoot yourself in the foot, right? *bzzt* wrong! Sometimes multiple inheritance comes in handy for very compelling practical reasons, but then what can one do? Well, one can simulate being both classes by having one class forward all calls to another, each of which extends one of the two superclasses. Of course, this is really crufty and impractical, which defeats the purpose: practicality.
Let me just make my point stand out:
You rarely need multiple inheritance, but when you do, it REALLY hurts not to have it.
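The forwarding workaround described above looks roughly like this (the class names are hypothetical):

```java
// Only single inheritance is allowed, so to act as both A and B, one
// class extends A and forwards every B method by hand to a contained B:
class A { String hello() { return "hello from A"; } }
class B { String world() { return "world from B"; } }

class Both extends A {
    private final B helper = new B();         // the simulated second superclass
    String world() { return helper.world(); } // one forwarder per method, by hand
}
```

Every method added to B means another forwarder added to Both--which is exactly the cruft complained about above.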
Keep the language small and simple: The core language is not too big, and not too complex: I won't call the concepts of inheritance, subclass method overriding and type-driven overloading simple and intuitive--they're in the gray area (and if you think they are simple, ask yourself if you've spent more than 30 hours using them).
However, I consider the standard library to be essentially a part of the language. Here, C beats java by far: not only can you hold a concept index of the C library in your head, you can hold (almost) the whole library API in your head (okay, I admit I can't remember whether the arguments to qsort are (pointer, nitems, itemsize, comparator) or (pointer, itemsize, nitems, comparator), but I think it's the first--which my manual page happens to agree with--and it's the same as bsearch minus the `key' argument), whereas for java, for a long time, I gave up outputting right-adjusted numbers in any pretty way because I didn't bother looking for something similar to printf. I know, there is javadoc documentation of the entire standard library and its API, but it's too big to browse for anything. Using google, I have found a text formatting system, but it's horribly verbose and bulky--you generate an object for each format (that is, "i = %02d, A[i] = x = %f, f(x) = %f" would require two format objects, one for "%02d" and one for "%f" being used twice).
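Assuming the "text formatting system" meant here is java.text.DecimalFormat, the pre-1.5 style looks like this--one formatter object per distinct format:

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class OldFormat {
    public static void main(String[] args) {
        DecimalFormat two = new DecimalFormat("00");        // plays the role of %02d
        DecimalFormat fixed = new DecimalFormat("0.000000", // plays the role of %f
                DecimalFormatSymbols.getInstance(Locale.US));
        System.out.println("i = " + two.format(7) + ", x = " + fixed.format(3.5));
        // prints: i = 07, x = 3.500000
    }
}
```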
However, what's worse is the overreliance on threads (because we all program multithreaded programs correctly, right? I do that some of the time, but not always). I heard a talk about Java I/O by Elliotte Rusty Harold (from technetcast), and as far as I recall, the argument for Java not doing asynchronous I/O was that "you can just use threads instead".
Okay, so suppose I didn't think this made the problem more complex than it had to be and hence didn't think this was an overuse of threads, let's look at what thread programming--critical region protection in particular--is like in java: there's monitors and wait sets.
What you say?
That's right, there's no mutex. There's no cond. There's no semaphore. There's no barrier. There's monitors and wait sets.
But wait--those are actually very similar to a mutex and a cond, right?
In principle, yes (well, maybe). In java, no.
Firstly, every object has one of each (monitor and wait set, that is). Yes, even your linked list, which is already protected by a different monitor. Yes, even the nodes in the linked list.
Secondly, the acquisition and release of a monitor are linked together, due to the syntactic construction:
When the keyword `synchronized' is applied to a method, it's equivalent to wrapping the entire method body in a synchronized (this) { ... } block, with the specification that when control leaves the brace-enclosed region, the monitor is released.
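For concreteness, the equivalence being described (standard Java semantics; the class is a made-up example):

```java
class Counter {
    private int n;

    // A synchronized method...
    synchronized void inc() { n++; }

    // ...is equivalent to an ordinary method whose whole body is a
    // synchronized (this) block: the monitor of `this` is acquired on
    // entry and released however control leaves the braces (return,
    // exception, falling off the end).
    void incEquivalent() {
        synchronized (this) {
            n++;
        }
    }

    synchronized int get() { return n; }
}
```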
So, what does this imply? It implies that one cannot, say, lock the monitor in one method and release it in another. It also makes it awkward for code to branch and leave the monitor early in one branch (even if you release the monitor in every branch)--the best way I can think of is to throw a lot of <ClassName><MethodName>SynchronizedBlock<n>CodePath<k> extends Throwable and continue the branch in the exception-handling code, but that means control-flow jumps that are non-local in the file, which drags readability downwards.
But what is worse, is that whenever one needs the low-level abstractions (mutex/cond), one has to build them of high-level abstractions. Let me repeat that:
One has to build low-level abstractions from high-level abstractions.
So, apparently this has something to do with cache coherency--something I'll probably never learn enough about--in that entering or exiting the monitor (or both) ensures that the thread's cached view of memory is made consistent.
Okay, so some people are a lot smarter than me when it comes to these things, that's okay, but as far as I can tell, incoherencies can only occur when you write, not when you read. So this raises the question: why the hell doesn't java have read/write locks instead of monitors? I mean, it's not like it's rocket science to write a read/write lock (in fact it's computer science ), and... hold that thought.
Some objects are immutable, some are mutable. The only synchronization guarantee one (probably) ever needs for immutable objects is that they aren't used before they're initialized--since when they are initialized, they don't ever change (by definition of immutability) and hence don't need synchronization (since they're never inconsistent). But of course immutable objects have monitors and wait sets even though very few of them need it.
Then there's mutable objects. Most of these are (presumably) not write-only, one both reads from and writes to them (or rather, their memory region).
... okay, release that thought. Java doesn't have read/write locks. If you agree with my assessment, probably the most useful synchronization device would be--exactly--read/write locks. Yes, of course you can build your own read/write lock class, which will of course require you to resynchronize your cache regarding your (bool,uint)-pair when someone locks the rwlock. And of course you have to resynchronize your cache when someone writes to the object that uses the rwlock.
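A bare-bones sketch of such a hand-rolled read/write lock, built out of exactly the monitors and wait sets complained about above (no writer priority, no fairness--just the (bool,uint) pair):

```java
class RWLock {
    private int readers = 0;         // the "uint" of the (bool,uint) pair
    private boolean writing = false; // the "bool"

    synchronized void readLock() throws InterruptedException {
        while (writing) wait();      // park in the wait set until no writer
        readers++;
    }
    synchronized void readUnlock() {
        if (--readers == 0) notifyAll(); // wake any waiting writer
    }
    synchronized void writeLock() throws InterruptedException {
        while (writing || readers > 0) wait();
        writing = true;
    }
    synchronized void writeUnlock() {
        writing = false;
        notifyAll(); // wake readers and writers alike
    }
}
```

Low-level abstraction, built from high-level abstractions, as promised.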
There might be some wonderfully clever reason why my linked list needs a monitor and a wait set instead of a read/write lock, and there might be an even more clever reason why the nodes need synchronization devices at all, but I don't get it.
Shows how stupid I am (and hence java can't trust me).
Provide only one way to do an operation: Regarding this particular point, I don't have anything bad to say about java.
Make it fast, even if it is not guaranteed to be portable:
As you recall, it was stated that one of the strengths of C is its generation of incredibly fast code. I'd like you to read the Zen of python in PEP 20: http://www.python.org/peps/pep-0020.html but I'll restate the (right now) most important proverb:
Practicality beats purity
And I think this is the generalized version of the essence of the fifth C proverb. So let's see, practicality beats purity.
I don't think the java folks ever heard (or believed) this--let me compare:
C has unrestricted use of pointers, for very easy traversal of complicated structures (fib heaps spring to mind). C is more terse: `puts' vs. `System.out.println'. Python has generator expressions and list comprehensions with `builtin' map and/or filter:
varname = [somedict[key] for key in somedict if key in someset]
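For contrast, a rough Java rendering of that one-liner (using modern collections for brevity; the names mirror the python):

```java
import java.util.*;

public class Comprehension {
    // varname = [somedict[key] for key in somedict if key in someset]
    static List<Integer> select(Map<String, Integer> somedict, Set<String> someset) {
        List<Integer> varname = new ArrayList<>();
        for (String key : somedict.keySet())
            if (someset.contains(key))
                varname.add(somedict.get(key));
        return varname;
    }

    public static void main(String[] args) {
        Map<String, Integer> somedict = new HashMap<>();
        somedict.put("a", 1); somedict.put("b", 2); somedict.put("c", 3);
        Set<String> someset = new HashSet<>(Arrays.asList("a", "c"));
        List<Integer> varname = select(somedict, someset);
        Collections.sort(varname); // HashMap iteration order is unspecified
        System.out.println(varname);
    }
}
```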
Java doesn't seem to have one bit of pragmatism.
Anyways, I've just got a new keyboard in the mail, so I'll shut down my machine now. Perhaps I'll come back and edit this.
Just to make sure I get this stated clearly enough: Donald Knuth held a lecture series about CS and religion, during which he was asked what his number one wish wrt. completing TAOCP was, to which he replied (paraphrase):
"This field (CS) is moving so fast. I'd wish for more bad ideas to slow everyone down--I'd like to see more things like java."
Last edited by jonaskoelker; 02-13-2006 at 01:39 PM.
As well written as this post was, it's very, very opinion driven. I, for one, found it laughable that this "Committee" believes that portability is not an important issue. I develop Java applications at home and at school and we use a lot of different OSes. It's a blessing that Java's ability to run across platforms does not get in the way. Some students have Macs, others Linux, some Windows, and we code mostly on Sun Sparc / Solaris architectures.
As we would be arguing with our own opinions rather than facts, I won't post a message as long as yours telling you why Java rocks, but I will tell you this: Java has its importance and usability just as C or python does. Use the right tool for the right job. You won't go anywhere trying to write drivers for a piece of hardware with Java, just as you won't be very successful writing firmware updates using C or Python. And as soon as you step into GUI programming with C you will be crying for a "real" OOP language. Again, right tool for the right job. Don't take your F50 to some off-road weekend.
However, I consider the standard library to be essentially a part of the language. Here, C beats java by far: not only can you hold a concept index of the C library in your head, you can hold (almost) the whole library API in your head
I beg to differ; while a big C fan, I have to admit Java beats C in that area, precisely because of its wide "standard library" vs the slim ANSI C one and the multitude of overlapping third-party libraries, which force C developers to reinvent the wheel or to struggle with randomly documented libs ...
I know, there are javadoc documentation of the entire standard library and its API, but it's too big to browse for anything. Using google, I have found a text formatting system, but it's horribly verbose and bulky--you generate an object for each format (that is, "i = %02d, A[i] = x = %f, f(x) = %f" would require two format objects, one for "%02d" and one for "%f" being used twice).
Not any more:
System.out.printf("i = %02d, A[i] = x = %f, f(x) = %f", i, a[i], f(x));
Funny thing is, the reason I dislike Java most is that it's too C-like. I prefer plain English. Spelling out the word AND is only one character more than && and infinitely less cryptic. Any decades-old compile-time justification is lost on me on my 3 GHz PC.
As well written as this post was, it's very, very opinion driven.
That's true, and I stated that at the very beginning of the post. So at least I'm not trying to pull the wool over anyone's eyes.
I, for one, found it laughable that this "Committee" believes that portability is not an important issue. I develop Java applications at home and at school and we use a lot of different OSes. It's a blessing that Java's ability to run across platforms does not get in the way. Some students have Macs, others Linux, some Windows, and we code mostly on Sun Sparc / Solaris architectures.
C is fairly portable--GCC compiles to about 32 archs last time I looked. However, that is fairly vacuous--but look around at some real C projects meant to be portable: they will probably port to at least what 95% of the users want it ported to. Take for instance Simon Tatham's portable puzzle collection (GIYF): it ports to GTK (GNU/Linux), Mac OS X, Microsoft Windows and Palm OS.
<snip> Use the right tool for the right job.
Absolutely. I just think that java is never the right tool for the job. In the areas where speed or closeness to hardware is important, C wins by far. In almost all other areas, it's about portability and getting the job done in fewer lines (bug count is roughly proportional to line count, or favors brevity even more strongly), and here I think python is the best tool in most cases. You may think it's perl, or php, or ruby, or $LANGUAGE; I just think java is never the right tool (for me at least)
[quote]you won't be very successful writing firmware updates using C or Python[/quote]
I haven't tried that, so I can't argue against your claim, but I can ask you to explain to me why C and python are both inadequate for the job.
The following section is in response to jlliagre
I beg to differ, while a big C fan, I have to admit Java beats C in that area precisely because of its wide "standard library" vs the Ansi C slim one
I agree that having features is no bad thing in itself--the java library is just too overwhelming for me. Also, for the tasks where one would typically use C, there is very little I miss in the standard library. Take again Tatham's puzzles--in the collection he supplies a (2,4)-tree and a maximum-flow algorithm, and that's about it.
I think the C++ standard library is just about the right size. The only fundamental thing I think is missing is graphs and a few more `garbage collecting' smart pointers, but see also my next installment.
(there is no printf) Not any more
I know about the changes to Java 1.5 (about god damn time...). However, the last time I checked (though I of course don't have an incentive to check often) no java compiler and VM which supported 1.5 respected the freedom of its users (that is, none were free software). It may just show the limitations of my knowledge, but I still hope no one falls into the java trap (as described on http://www.gnu.org/philosophy/java-trap.html).
And I also have a word for crito: yes, I prefer `and' to `&&' by far. python only has english, C++ has both, and C has #define (and I often define `and', `or' and `not' to their expected values). Java has nothing.
Last edited by jonaskoelker; 02-13-2006 at 02:15 PM.
I must say, it has been a while since I read threads from people who know what they are saying and have a point. I really like reading your thread and your replies, even though I don't agree with everything (otherwise, where would the fun be, ^_^).
I actually started a similar thread a while ago called "So, why not Java". I'm mostly interested in game programming and I just happen to use Java a lot. You can check the old thread here:
C is not good for writing firmware because, well, it's still too big to fit in embedded systems, as far as I can tell. The language used to write firmware is called "Forth". Forth is usually used to compile threaded code. It's heavily used by Sun and IBM for their BIOSes, and coincidentally, FreeBSD uses the Forth language in its first-stage boot loader.
What I dislike in Java, if anything, is actually its own portability. While portable enough, we have to wait (or beg) for Sun to port the Virtual Machine to other systems (at least officially), such as BeOS or even FreeBSD (an OS that I love, by the way). Also, while appealing for me to develop games with, I'm very well aware of the limitations of Java, especially if I ever wanted to move to console programming. The Game Boy Advance uses C, for example (in a very OOP-fashioned way, though).
So, this is the second installment in my `I hate java' article.
In this, I talk about some of the other programming languages that I like less than C and python. In particular, I plan to talk about C++, lisp/scheme, perl and bash. I'll also try to compare them to java, and explain why I would pick them over java any day of the week.
What all these languages have in common is that I can understand why someone would use them--they all maximize some characteristic: C++ has its expressiveness (that is, you can express everything *very* precisely), lisp through its macro system has its flexibility, perl is very pragmatic, and bash has the biggest `library'.
But that being said, each lacks something: C++ is too big and complex, lisp is too verbose--I want to write collection[i], not (vector-ref collection i)--perl is *too* pragmatic (that is, at the cost of maintainability), and bash is unportable.
So, let's have a closer look at each one, starting with [b]C++[/b]. I once actually really loved C++, and I think it does contain something very valuable. For one, I think templates are one great idea--type-genericity in general is a good feature. However, the syntax is horribly verbose. I think I'll confine myself to *using* template classes and functions. Of course, it helps that there is a decent standard library utilizing templates to achieve something useful: saving the programmer from implementing vectors/lists/maps/sets over and over again (I do from time to time miss slists and hashmaps in the stdlib, though, but one can use stlport).
Then there's exceptions. While useful sometimes, I both laugh and cry at the option of throwing a bool. It *does* provide a decent way to `return' a bool from a deep recursion tree, but it has to be used carefully so you don't shoot yourself in the foot. Then again, trust the programmer. I'm ambivalent on exceptions as they're implemented in C++ (I think they're done ok in python and SML, and their absence is done well in C).
However, on top of something very useful, C++ also adds object orientation, the problems of virtual/non-virtual inheritance, member pointers, template template arguments (who uses these?), and horrible iostream syntactic conventions. In other words, C++ loses the simplicity and compactness that C and python contain.
Not that it took me very long to learn the feature set. I wrote a working, usable fetchmailconf, with GUI, in six working days, of which perhaps the equivalent of two days were spent learning Python itself. This reflects another useful property of the language: it is compact--you can hold its entire feature set (and at least a concept index of its libraries) in your head. C is a famously compact language. Perl is notoriously not; one of the things the notion ``There's more than one way to do it!'' costs Perl is the possibility of compactness.
So, in the end I somewhat dislike C++ because of its lack of compactness and its template verbosity (SML does templates--`type genericity'--much more elegantly and tersely). Then again, C++ doesn't impose on me--it doesn't stop me from doing what needs to be done. Also, a strength it shares with python and (to a lesser extent) C is its capability for multiparadigm programming (although C++ is best at procedural and OO, C is best at procedural and OK at OO, and python is best at OO and fine at procedural, they can all do some functional `tricks'). Also, given the close-to-the-hardware'ness of C++, I miss a statically sized bit array, and a guarantee that vector<bool> uses only one bit per element (or as close as possible). Also, one thing I slightly dislike about C++ is that strings aren't just a vector<char>:
Most data structures exist because of speed. For example, many languages today have both strings and lists. Semantically, strings are more or less a subset of lists in which the elements are characters. So why do you need a separate data type? You don't, really. Strings only exist for efficiency. But it's lame to clutter up the semantics of the language with hacks to make programs run faster. Having strings in a language seems to be a case of premature optimization.
Now, let's have a look at lisp and scheme.
I have limited experience with lisp/scheme, and I haven't truly loved them, although I can see why some people would like them: they offer flexibility and some (in the words of RMS) mathematical cleanliness. It's of course the king of flexibility, with the macro system being able to replace any macro invocation with an arbitrary parse tree, something cpp can't (although cpp can do 80% of the job with 20% of the effort--all in the best `worse is better' style).
Let me restate a point:
You rarely need arbitrary macro replacement, but when you do, it REALLY hurts not to have it
Then again, what lisp has in flexibility, it loses in brevity: I really hate writing vector-ref. However, that is not inherent to lisp, and Paul Graham suggests a way to do away with that, with an s-expression I consider just about short enough: (collection i). That is, call the collection with the index (for great subsequence extraction powers with map): see http://www.paulgraham.com/arcll1.html
However, I also dislike the way my code looks (see also http://www.paulgraham.com/pypar.html)--not that I mind the parentheses, I just don't think there's any way to indent the code such that it's both pretty and doesn't use my whole screen for three levels of nesting. This might be because I'm working against the language instead of with it--admittedly, I probably don't truly grasp functional programming--but this *is* a problem, and I'd rather write SML (which has *no* macro system) than any lisp I know currently (elisp, PLT scheme).
Also, if the books on functional programming are representative for the languages, the paradigm is only good for writing metacircular interpreters :P
In any case, I still think lisp isn't getting in the way, some of it is just getting in my way.
All right, next victim is perl.
I haven't written a whole lot of perl, but I've read some and successfully added a feature to a script (update-alternatives, a debian tool).
So, what was my experience? For one, I really hate that one has to write braces around the single-statement bodies of compound statements (i.e. if, while, for). Yes, one can write them as statement modifiers, but one can only get a subset of the effect, AIUI (how does one do multiple if/else?). But that's an acceptable syntactic issue. Another syntactic issue is the glyphs (what the &@$% was Larry thinking?), but I can possibly accept that.
What I really dislike is the distinction between list/scalar context. What does `reverse reverse "hello"' mean in scalar context? In list context? It's not the same. That, plus the use of default variables costs perl a lot in readability, which is much more important than people think. Think about this: a lot of time is spent modifying and debugging code, and a lot less writing it in the first place, during the lifetime of a software project that actually `takes off'.
Then again, perl is pragmatic too: "print while(<>)" is a bare-bones cat(1). That extreme brevity may not extend completely to every use of the language, but let's just say that perl is (or at least can be) terse. So it does make for short programs.
So, let's have a look at the final language of this installment, bash.
bash is, as you know, the default interactive shell for about 90% of all GNU/Linux systems. And for small hack-it-together uses it's quite good:
for i in $(seq 1 100); do wget http://not.exists.org/file$(printf "%02d" $i).ext/; done &
Writing that in just about any other language is bound to be longer. So here we see the useful `library' of bash.
However, that library is not guaranteed to exist on every platform, it's not guaranteed to be usable in the same way, and you're not guaranteed any consistent initial state (updatedb has been run recently? Don't count on that). There's not even a guarantee that bash is available, it might be some other posix-compliant shell.
So, of course, there's the GNU autotools which outputs portable configure scripts to make your C programs more portable, but still... it says a lot about shell scripting that portable scripts are best generated.
So, that concludes my brief tour of other programming languages I like less than C and python.
Let me summarize, and explain why I like them better than java.
java (esp. java 5) is pretty close to C++: OO, exceptions, templates, strong static manifest (i.e. `spelled-out') types. However, C++ doesn't impose on you--it trusts you to only shoot yourself in the foot when you have to.
Then there's lisp. Lisp is the king of flexibility at the expense of brevity. Java, too, is very verbose (`ArrayIndexOutOfBoundsException'? Not `IndexError'?), but it's not a trade-off, it's just a sacrifice--I don't think I gain anything by using java over lisp.
Then perl: perl loses some readability, which java also loses through its (blatantly stupid) only-one-public-class-per-file rule--one loses a lot of programmer time regaining deep focus when one has to switch files all the time. However, java doesn't buy me pragmatism (show me a one-statement cat).
Finally, there's bash: it matches java wrt. stdlib, and loses out on portability, but it buys a whole lot more of two-line hacks than java can ever dish out--which is a good trade for the domain of bash.
So, while I'd prefer C or python to each of these four for most projects, I think they are good choices for some programs and/or people: bash for the two-line hacks, perl for the 100-line hacks, C++ for the 1M-line `hacks', and lisp for those who like lisp.
But C beats java on speed, and python beats java on code size, so I still don't think java is the best tool for any job. At least not for me. Maybe if one really likes java (but I can't see that happening).
And just to make sure I don't forget to mention it: Paul Graham has a lot to say about java's cover on http://www.paulgraham.com/javacover.html -- and I agree especially on the hyped part: good languages don't have to be hyped, they get used because they are good.
Anyways, again, this is just my opinion. If you like java, more power to you. As long as your code is free and can be compiled and run on a free compiler and VM.
As a side note to Mega Man X, C can be compiled for PICs, some of which have only 256 bytes of RAM. Of course, it's a very limited version of C (not the real C by far), without the stdlib, but still...