LinuxQuestions.org
Syndicated Linux News This forum is for the discussion of Syndicated Linux News stories.

Old 05-28-2013, 03:51 PM   #1
LXer
LXer NewsBot
 
Registered: Dec 2005
Posts: 109,143

Rep: Reputation: 117
LXer: Optimizing BASH Scripts


Published at LXer:

Shell script writers sometimes find that a script runs slowly, but cannot figure out which code is causing the issue. Even when they find the problem, they may have difficulty finding a better way to write it. This article shows examples of inefficient code alongside more efficient alternatives, and covers methods of writing scripts so that they execute quickly.

Read More...
 
Old 05-30-2013, 02:37 PM   #2
David the H.
Bash Guru
 
Registered: Jun 2004
Location: Osaka, Japan
Distribution: Debian + kde 4 / 5
Posts: 6,846

Rep: Reputation: 2007
A bit of good advice, but horrible examples. And it didn't go nearly deep enough into what's possible.

To start with, it would've been better to call out the Useless Use Of Cat more clearly, and not to continue using it in the subsequent examples.
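For anyone unfamiliar, here's a minimal sketch of the Useless Use Of Cat and its fix (the file and commands are my own illustration, not from the article):

```shell
#!/bin/bash
# Illustrative input file (hypothetical path)
printf 'foo bar baz\n' > /tmp/uuoc_demo.txt

# Useless Use Of Cat: spawns an extra process and a pipe just to feed one file
out1=$(cat /tmp/uuoc_demo.txt | tr 'a-z' 'A-Z')

# Better: redirect the file straight into the command -- one process fewer
out2=$(tr 'a-z' 'A-Z' < /tmp/uuoc_demo.txt)

echo "$out1"
echo "$out2"
```

Both lines print the same thing; only the second avoids the extra process.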

Even better, I would stress the importance of avoiding external process calls of most kinds whenever possible. All the sed examples at the end could've been replaced with parameter substitutions, saving a process or two. And don't call something like date 20 times; run it once, save the output to a variable, and use that. You could even save multiple values in a single variable or array, and use the shell to print out only the ones you need at the time.
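A sketch of the date point, assuming bash (the variable and array names are mine):

```shell
#!/bin/bash
# Slow pattern: $(date ...) inside a loop spawns one process per iteration.
# Faster: run date once, save the output, and reuse it.
today=$(date +%F)
for i in 1 2 3; do
    echo "$today item $i"
done

# Several values from a single call, stored in one array
read -r -a dt <<< "$(date '+%Y %m %d')"
echo "year=${dt[0]} month=${dt[1]} day=${dt[2]}"
```

One fork instead of twenty, and the array lets you print only the fields you need.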

Interestingly, in my testing, looping through small files (up to about 0.5 to 1 kilobyte, depending on the exact operation) in the shell and processing every line can often be faster than doing the same action with sed or awk.

Code:
#on a file that contains 20 lines of "foo bar baz" (244 bytes)

$ time { while read -r line; do echo "${line/foo/FOO}"; done <file.txt ;}
FOO bar baz
FOO bar baz
....etc....

real    0m0.004s
user    0m0.004s
sys     0m0.000s

$ time sed 's/foo/FOO/' file.txt
FOO bar baz
FOO bar baz
....etc....

real    0m0.008s
user    0m0.000s
sys     0m0.000s
sed started to perform faster than the loop at about 50 lines on my system. But the following variation was still highly competitive even at 100 lines:

Code:
$ time { mapfile -t lines <file.txt ; printf '%s\n' "${lines[@]/foo/FOO}" ;} 
FOO bar baz
FOO bar baz
....etc....

real    0m0.009s
user    0m0.000s
sys     0m0.004s
 
Old 05-31-2013, 03:23 AM   #3
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Rep: Reputation: 1292
I agree with not using cat unless you need it. Checking my scripts, I only use it once to cat a file to be viewed on the terminal.

I tend to write bash scripts using external utilities wherever possible, because they are faster, especially for large amounts of data. For smaller amounts, bash can be faster, as you have shown. I would also consider the readability of the code; on that basis I would go with the external utilities, since the sed line is much easier to read.
 
Old 05-31-2013, 12:22 PM   #4
David the H.
Bash Guru
 
Registered: Jun 2004
Location: Osaka, Japan
Distribution: Debian + kde 4 / 5
Posts: 6,846

Rep: Reputation: 2007Reputation: 2007Reputation: 2007Reputation: 2007Reputation: 2007Reputation: 2007Reputation: 2007Reputation: 2007Reputation: 2007Reputation: 2007Reputation: 2007
Compiled external programs may be faster in themselves, but this is balanced against the need for the shell to spawn extra processes for them.

My usual advice is to mostly use text-processing tools like grep/sed/awk when you have to operate on large blocks of text en masse, or when you have complex operations that the shell can't do easily. Pre-filtering lines with grep before reading them into the shell is a good use of them, for example. But once a text string is stored in a variable, it's nearly always more efficient to use in-shell manipulations like parameter substitution. Remember, the shell interpreter is also a compiled program, and its individual built-in operations are generally just as efficient at what they do as the others.
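A sketch of that division of labor (the log file and its contents are hypothetical): grep does the bulk filtering once, and parameter substitution handles each matched line in-shell:

```shell
#!/bin/bash
# Hypothetical log file for illustration
printf 'INFO start\nERROR disk full\nINFO done\nERROR net down\n' > /tmp/demo.log

# Pre-filter with a single grep call, then process the matches with
# in-shell parameter substitution -- no sed/awk process per line
grep '^ERROR' /tmp/demo.log > /tmp/demo_errors.txt

count=0
while read -r line; do
    # split each line on its first space, entirely in-shell
    echo "severity=${line%% *} msg=${line#* }"
    count=$((count + 1))
done < /tmp/demo_errors.txt
echo "matched $count lines"
```

Exactly one external process (grep) runs, no matter how many lines match.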

As for readability, or, to be more precise, comprehensibility, there is a rather large subjective component involved, and a lot of it depends on experience. I personally find shell code no more difficult to understand than the syntax of sed or awk, and it can often be easier to figure out than some of the more complex expressions of the latter.

I will admit that, given a choice between a sed one-liner and a 10-line shell function that does the same thing, it often makes sense to go with the former. But this kind of thing is not at all acceptable, in my opinion:

Code:
# uggh!
var=foobarbaz
var=$( echo "$var" | sed 's/baz$//' )
echo "$var"

# yes!
var=foobarbaz
var=${var%baz}
echo "$var"
I find the in-shell version to be both cleaner and easier to read, and it's almost infinitely faster (it's hard to beat 0).

And of course, the topic at hand here is optimization, not readability. When milliseconds count, leave sed out, at least in kinds of operations like the above.
 
  


