Programming
This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
Most of my career has been spent in "consulting," which means that I have encountered "installed production systems" in damn-near every language that is or was ever popular. I became extremely good at the art of landing four-paws-down no matter what it was, because I knew that the economics of computer software are always: "dance with the one that brung ya."
Perl is an extremely good dance partner ...
Now, you do not have to pretend to be an expert, at least going in, but you do need to be prepared.
In my assessment, the Perl language is not remarkable, but the "CPAN" contributed library definitely is. ("208,509 Perl modules in 43,174 distributions, written by 14,220 authors.") And that, in my opinion, is really "what the fuss is all about."
Furthermore, any time you install anything from that library onto your system, the installer runs a lengthy set of self-tests on your system, and will not willingly install the package if any of those self-tests fail. (It recursively applies the same tests to every prerequisite or co-requisite.)
Check out their community web site: perlmonks.org. Like the language itself, it is one of the Internet's oldest developer web sites, and definitely quirky, but profoundly informative. And (like LQ ...), "with an archive that goes on forever."
Last edited by sundialsvcs; 02-10-2022 at 12:01 PM.
Perl enthusiasts with long memories may get a kick out of this. I wrote it a few years back.
Code:
#! /usr/bin/perl
use warnings;
use strict;
use File::Basename;
my $school = 'Randal Schwartz U';
# Study Perl docs and FAQs to become a Perl hacker.
open(my $study,'-|',basename($^X).'doc '.basename($^X).'faq1') or die 'Flunked';
while (<$study>) {
    /"([^"]*)".*$school/i && print $1, "\n";
}
There is some overpriced content-management software from HP called TeamSite that uses lots of Perl. You can get a paying job maintaining it for companies that bought into it. But then, the same can be said of COBOL.
Speaking as someone who once wrote and taught a community college class in COBOL after working for a failed software-reengineering company that targeted COBOL, I can attest that there are still millions of installed-and-working applications in COBOL to this day. And, there is a reason why that particular language is still used.
For all of its verbosity, COBOL has a fairly unique and well-designed capacity to specify exactly how an arithmetic calculation is to be performed, as floating-point or in decimal, allowing for precise control of the handling of "dollars and cents." Even when these programs run through billions of records (as they routinely do), errors do not accumulate.
The IBM System/360 and System/370 architectures (whose descendants now run z/OS) were very much designed as "COBOL machines," and they run such programs extremely efficiently. The algorithms customarily used by COBOL for massively large data-processing tasks still work as well as they always did. The language is easy to understand at a superficial level, but it does have subtleties that take a little longer to grasp.
As an example of the sort of data-sets I'm talking about: "a record of every prescription that was ever written in the United States over the past ten years." It isn't in an SQL database. COBOL gets the job because it's hands-down the fastest, most reliable, and most precise way to do it.
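The "algorithms customarily used" for jobs like that are mostly sequential sort/merge passes: keep the master file and the transaction file sorted on the same key, then stream through both exactly once, never holding more than a record or two in memory. A minimal sketch of the merge step in Python (the keys and amounts here are invented for illustration, not any real record layout):

```python
import heapq

# Classic batch pattern: merge two key-sorted streams in one sequential pass.
# In a real job these would be file readers, not in-memory lists.
master = [("A001", 100), ("A003", 250), ("A007", 75)]       # sorted by key
transactions = [("A002", 10), ("A003", 40), ("A009", 5)]    # sorted by key

# heapq.merge is lazy and stable: equal keys come out master-first.
merged = list(heapq.merge(master, transactions, key=lambda r: r[0]))
for key, amount in merged:
    print(key, amount)
# A001 100, A002 10, A003 250, A003 40, A007 75, A009 5
```

Because each stream is read front to back exactly once, the working set stays tiny no matter how many billions of records flow through, which is why these programs scale the way they do.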
Last edited by sundialsvcs; 02-10-2022 at 12:13 PM.
For all of its verbosity, COBOL has a fairly unique and well-designed capacity to specify exactly how an arithmetic calculation is to be performed, as floating-point or in decimal, allowing for precise control of the handling of "dollars and cents." Even when these programs run through billions of records (as they routinely do), errors do not accumulate.
That is extremely interesting. Actually, I think it is not specific to COBOL; other languages can do it too, where the creators took care of it. It was more than 25 years ago that I learned how much the precision and efficiency of a calculation depend on which mathematical method is chosen to implement a given formula.
Regarding Perl, I think it is likewise tuned to process text files efficiently, based on regular expressions. It probably could have been applied effectively in artificial-intelligence research too. But nowadays, it seems, we have different requirements and a different approach.
Correction: every computer language ever written was created to solve or correct a specific problem or issue.
COBOL was specifically written to handle documents, and the computations needed to evaluate them and produce new summary or extraction documents, at high volume and quickly. It was soon extended to manage other functions as well, to fulfill the requirements of the US Navy and big business. It has since been extended further to include object-oriented concepts and massive data extraction. Its math functions are precise, though not always efficient. Many core business and government engines on midrange and mainframe computers still use COBOL, because it is still the best tool for that job.
BASIC and Pascal were written to support education, and to make training coders faster and more general in different ways. Both inspired improved languages and improved education.
FORTRAN was designed to solve mathematical problems. It is, arguably to this day, still the most elegant language for matrix computation. (Every physics student uses calculus to derive Einstein's famous E = mc²; Einstein himself worked with matrix methods that are computationally intensive, something computers do better and faster than people.) Weather forecasters, and people doing hard-core computational analysis for orbital mechanics, climate prediction, and nuclear physics, are likely to use FORTRAN on a supercomputer to cut the time needed to get good results from decades to hours.
Perl was originally a general-purpose language, back when shells were pretty stupid and disk I/O was slow by today's standards. It allowed system managers to write code one way and run it anywhere you could get a Perl interpreter, without calling external utilities, which reduced I/O and improved performance. Like, a LOT!
(Yes, I am geeky about languages. I still treasure my Dragon (compiler construction) and Camel (Perl) books!)
Perl was never intended to replace FORTRAN or COBOL. It was intended to improve utility and performance for system engineers and system administrators. It did, and it still does.
The only languages that have virtually vanished have done so because newer and better tools have been created to solve the same problems. As long as a language is the best answer, it is unlikely to vanish.
There are now other tools that solve some of the same problems Perl solves, but none of them are significantly better, just different. I do not expect it to vanish within my lifetime.
The magic in the COBOL language is buried in the PIC(TURE) clause. If you declare a variable to be, say, PIC 9999V99, then it will be computationally(!) represented by six decimal digits, with an implied decimal point indicated by the "V". All computations on it will be performed in decimal mode.
Whereas, if you declare it COMP(UTATIONAL), it will be stored in binary: plain COMP is a binary integer, COMP-1 and COMP-2 are single- and double-precision floating-point, and COMP-3 is packed decimal, so you can specify exactly what you mean.
All of this magic is contained in the DATA DIVISION, not in the PROCEDURE DIVISION where the procedural logic is located. You cannot understand a statement such as ADD X TO Y GIVING Z without referring to exactly how each variable is defined in the DATA DIVISION.
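For readers without a COBOL compiler handy, the behavior of a PIC 9999V99 field can be approximated in Python with the standard decimal module. This is only a sketch: pic_9999v99 is my own helper name, not COBOL syntax, and real COBOL rounding/truncation rules depend on the statement used.

```python
from decimal import Decimal, ROUND_HALF_UP

# Emulate COBOL "PIC 9999V99": six decimal digits, with an implied
# decimal point ("V") two places from the right.
TWO_PLACES = Decimal("0.01")

def pic_9999v99(value):
    """Store a value the way a PIC 9999V99 field would: decimal, 2 places."""
    d = Decimal(str(value)).quantize(TWO_PLACES, rounding=ROUND_HALF_UP)
    # A PIC 9999V99 field has only four integer digits.
    assert abs(d) < Decimal("10000"), "field overflow"
    return d

# Rough equivalent of: ADD X TO Y GIVING Z.
x = pic_9999v99("1234.56")
y = pic_9999v99("0.04")
z = pic_9999v99(x + y)
print(z)  # 1234.60
```

The point the post makes survives the translation: the arithmetic behavior lives in the declaration (here, the quantize step), not in the statement that does the adding.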
In my educated opinion, COBOL is really designed, and today used, for one thing: massive-volume data-processing tasks that require absolute control over exactly how the arithmetic will be carried out, combined with the maximum possible speed. There are plenty of such applications out there.
Perl has been called "the Swiss Army knife of pragmatic data processing." It defined the reference dialect of "regular expressions" that everyone else adopted (PCRE stands for Perl-Compatible Regular Expressions), and it was the first language I know of to build them into the language itself. The people who built the language, and who have worked on it since, have had pragmatic reasons for doing so, and it clearly shows. The base interpreter is cleanly implemented and very fast on all platforms.
And, as I have said, it really comes down to CPAN: a truly vast library of contributed code which tests itself on your system every time you install anything. You can put up with a lot, in terms of the language, in order to get your hands on a battle-tested resource like that. "208,509 Perl modules in 43,174 distributions, written by 14,220 authors." Oh, wait: it's "208,513" now.
Last edited by sundialsvcs; 02-10-2022 at 04:30 PM.
There are now other tools that solve some of the same problems Perl solves, but none of them are significantly better, just different. I do not expect it to vanish within my lifetime.
Actually, this is more or less general now: you can find the same sort of library/module ecosystem for all the "major" languages, like Java, Perl, and Python (and others). The common problems are usually already solved in all of them: creating a web server, IPC, threads, connecting to a database, scientific calculations, logging, parsing/analyzing text, unit tests, and so on.
But don't forget: a real database engine, a real video editor, a game engine, a math engine, and the like are still written in C/C++ and compiled for the target system (and can then be used from Java/Perl/Python/whatever, too).
The magic in the COBOL language is buried in the PIC(TURE) clause. If you declare a variable to be, say, PIC 9999V99, then it will be computationally(!) represented by six decimal digits, with an implied decimal point indicated by the "V". All computations on it will be performed in decimal mode.
Decimal arithmetic fell out of fashion once binary floating-point hardware became standard on CPUs. The speed of hardware floating-point overwhelmed all other considerations.
Ed
Actually, EdGr, this is emphatically not(!) the case. Not when you are talking about "dollars and cents!"
If your task is to add up an endless column of figures, and you do this using binary floating-point, then the errors can quickly grow large, because values such as 0.10 have no exact binary representation: you translate each input into a floating-point "equivalent," round at every operation, and then translate the final sum back out.
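The effect is easy to demonstrate. A sketch in Python rather than COBOL or Perl, since any binary-floating-point language behaves the same way: add ten cents a million times in binary doubles, then again in decimal.

```python
from decimal import Decimal

# Add $0.10 one million times, two ways.
f = 0.0
for _ in range(1_000_000):
    f += 0.10             # binary double: 0.10 is not exactly representable

d = Decimal("0.00")
for _ in range(1_000_000):
    d += Decimal("0.10")  # decimal arithmetic: exact at every step

print(f)   # slightly off 100000.0 -- the rounding error has accumulated
print(d)   # 100000.00, exactly
```

The binary sum lands close to, but not exactly on, 100000.0; the decimal sum is exact. Scale that to billions of records and you see why the ledger people insist on decimal.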
Although most programming languages – including Perl – provide "decimal arithmetic" using some external package or packages, none other than COBOL (to my knowledge ...) built it right into the language itself ... and optimized the hell out of it.
Every microprocessor that I am aware of, even the old MOS 6502, implemented a "decimal mode" as well as binary. So, the hardware support has long existed, and I doubt that "speed" was ever a consideration between the two.
Yes: BCD (binary-coded decimal) has always been important. Each four-bit group represents either a decimal digit or a sign indicator, and arithmetic is performed using decimal rules. CPU hardware has supported both.
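Packed BCD is simple to illustrate. In this Python sketch (the helper names are my own, and this omits the sign nibble that real mainframe packed-decimal formats carry), each byte holds two decimal digits, one per four-bit nibble:

```python
def pack_bcd(number_string):
    """Pack a string of decimal digits, two digits per byte (one per nibble)."""
    if len(number_string) % 2:            # pad to an even digit count
        number_string = "0" + number_string
    out = bytearray()
    for i in range(0, len(number_string), 2):
        hi, lo = int(number_string[i]), int(number_string[i + 1])
        out.append((hi << 4) | lo)        # high nibble holds the first digit
    return bytes(out)

def unpack_bcd(data):
    """Recover the digit string from packed BCD bytes."""
    return "".join(f"{b >> 4}{b & 0x0F}" for b in data)

packed = pack_bcd("123456")
print(packed.hex())        # 123456 -- the hex dump reads like the decimal number
print(unpack_bcd(packed))  # 123456
```

That readable hex dump is the charm of the format: a core dump of a packed-decimal field can be read off by eye, digit for digit.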
Last edited by sundialsvcs; 02-11-2022 at 11:40 AM.
sundialsvcs - I disagree. On modern CPUs, binary arithmetic is orders of magnitude faster than decimal arithmetic. An IEEE double has enough precision to calculate the US GDP to the nearest cent. That makes rounding errors unlikely for most dollars/cents calculations.
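That precision claim is a quick back-of-the-envelope check, sketched here in Python (the GDP figure is approximate): an IEEE-754 double carries a 53-bit significand, so every integer up to 2^53 is represented exactly, and working in whole cents keeps dollar sums inside that range.

```python
# Every integer up to 2**53 is exactly representable in a binary64 double.
exact_limit_cents = 2 ** 53
us_gdp_cents = 23_000_000_000_000 * 100   # ~$23 trillion (2021), in cents

print(exact_limit_cents)                  # 9007199254740992, i.e. ~$90 trillion
print(us_gdp_cents < exact_limit_cents)   # True: GDP-sized sums stay exact

# The classic caveat applies only to *fractional* dollars:
print(0.1 + 0.2 == 0.3)                   # False -- so carry cents, not dollars
```

In other words, both posters have a point: binary doubles are exact for integer cents at any realistic scale, while fractional-dollar arithmetic in binary is where the drift comes from.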
Ed