Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
I've been using Linux in one job or another for about 10 years. I used to be a C programmer, and I'm decent with Perl and passable with Ruby. These days I do a lot of text data analysis. I tend to work on fairly short-term projects, generally too short and too limited to justify writing a Perl script, but I do need to work fast. I work with lots of different kinds of text data: fixed-width files, CSV files, delimited files, XML files...
I'm always looking for slick little time savers.
Chunks of my day are spent writing something like this at the command line:
Code:
for i in $(blah)
do
    foo "$i" | grep bar | sort | uniq -c | grep -v " 1\>"
done
My latest find is process substitution in bash.
Here's what the man page has to say about process substitution:
Code:
Process Substitution
    Process substitution is supported on systems that support named pipes
    (FIFOs) or the /dev/fd method of naming open files. It takes the form
    of <(list) or >(list). The process list is run with its input or output
    connected to a FIFO or some file in /dev/fd. The name of this file is
    passed as an argument to the current command as the result of the
    expansion. If the >(list) form is used, writing to the file will
    provide input for list. If the <(list) form is used, the file passed as
    an argument should be read to obtain the output of list.

    When available, process substitution is performed simultaneously with
    parameter and variable expansion, command substitution, and arithmetic
    expansion.
Here's what this means in real life:
I have two text files which are sorted differently, let's call them 'a' and 'b'. I've done a 'wc -l' on the files, and I know that file b is 5 lines longer than a. In the past, I've always had to sort a and b into a couple of temp files, say a.tmp and b.tmp, then diff a.tmp and b.tmp. Using process substitution, I can eliminate the need for the temp files:
Code:
diff <(sort a) <(sort b)
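The same trick works anywhere a command expects file arguments. As a further sketch, comm (which requires sorted input) can compare two unsorted files directly; the sample files and their contents here are invented for illustration:

```shell
# Two small unsorted sample files (made-up data)
printf 'banana\napple\ncherry\n' > a
printf 'cherry\ndate\napple\n' > b

# comm -12 suppresses columns 1 and 2, printing only lines
# common to both inputs; process substitution feeds it
# sorted views of each file, with no temp files
comm -12 <(sort a) <(sort b)
```

Here that prints the two lines the files share: apple and cherry.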
so... I'm fishing around for other people's little time savers...
I spend quite a lot of time in the shell, and I absolutely love how the Linux shells work. Now I wonder how I survived my DOS years with such limited functionality. But on to the topic.
These are some of the little things I use in my .bashrc.
Code:
# Function to add a directory to $PATH, only if it's not already there
# Usage: add_path <path>
add_path () {
    if [ "$PATH" == "${PATH/$1/}" ]
    then
        PATH="$1:$PATH"
        export PATH
    fi
}
# Function to search the bash history
function recal () {
    if [ -z "$1" ]; then
        echo "Usage: recal <string>, to search for <string> in the bash history."
    else
        history | grep "$1" | grep -v recal
    fi
}
# Function to rename a file to lower case
function lowcase () {
    NEW_NAME="$(echo "$1" | tr '[A-Z]' '[a-z]')"
    if [ ! "$1" == "$NEW_NAME" ]
    then
        mv "$1" "$NEW_NAME"
    fi
}
# Function to join two parts of a movie. Both must be encoded with the same codec,
# otherwise you are going to have problems playing the result in anything
# that's not mplayer. This just copies the streams and rebuilds the index;
# no re-encoding is done, so it's quite fast.
function avijoin () {
    local part1="$1"
    local part2="$2"
    local out_f="$3"
    mencoder -oac copy -ovc copy -idx "$part1" "$part2" -o "$out_f"
}
That, and a bunch of aliases that save me some typing.
I also make heavy use of the AND and OR operators, including when I don't want to bother with an if...else...fi block for something simple.
Code:
$ [ -h /bin/sh ] && echo "/bin/sh is a symlink" || echo "/bin/sh is not a symlink"
/bin/sh is a symlink
Nice and neat. It saves some typing as well.
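One caveat worth knowing about this idiom (a general shell gotcha, not specific to the symlink example): `test && a || b` is not a true if/else. If `a` itself fails, `b` runs as well:

```shell
# The test succeeds, the && command fails, so the || branch ALSO runs:
true && false || echo "surprise: this prints too"

# An explicit if/else only ever runs one branch:
if true; then false; else echo "never printed"; fi
```

So the shortcut is fine when the && command can't fail (echo, for instance), but reach for a real if/else otherwise.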
Another small trick I use quite a lot is "exec bash", when I want to reload .bashrc or something like that. It replaces the current shell with a new one, so you don't need to log out and back in, or close urxvt and launch it again.
I like the add_path function. It's so simple, I had a "why didn't I think of that?" moment.
I do use && and || fairly extensively.
When you use "exec bash", does it leave your history intact?
I was looking through the bash man pages today to see if there were any other nice little tidbits in there. I did run across this, which I will use:
Code:
mkdir -p {a,b,c}/asdf
This will create three directory trees:
a/asdf
b/asdf
c/asdf
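Brace expansion with an empty alternative also gives a quick in-place backup idiom; the file name here is just an example:

```shell
touch file.txt                  # example file
# file.txt{,.bak} expands to: file.txt file.txt.bak
cp file.txt{,.bak}
ls file.txt file.txt.bak        # both now exist
```

One set of braces, and no need to type the name twice.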
Where I work, we have a production environment 'PROD' and a pre-production environment 'PREP'. I often want to diff the same file between prep and prod, which have the same directory structure (both mounted at the root directory). Using the same feature, I will be able to do this:
Code:
diff /{prep,prod}/path/to/some/files
It depends on what you mean. It reloads bash; to all effects, it's as if you opened a new terminal with a new bash session. That means the history will be reloaded, with all the latest additions.
When you use many shells, what you do in the shell on, let's say, termA doesn't magically appear in the history for termB, and so on. But if you open a new bash session, then the commands that you ran on termA and termB are available in the history of this new session.
Of course, if you use the exec bash trick, then "exec bash" itself will be added to the history as well.
Quote:
diff /{prep,prod}/path/to/some/files
Yep. Bash is really powerful.
Another thing you might want to look at is bash's string-mangling capabilities, if you haven't already. That will save you quite a few seds and such in your scripts.
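As a quick taste of the string mangling mentioned above (the file name is invented), the ${var//pattern/replacement} and ${var%pattern} forms alone cover many one-off sed jobs:

```shell
f="my report 2009 final.txt"

# Replace every space with an underscore, no external process needed
echo "${f// /_}"

# Strip a trailing suffix with ${var%pattern}
echo "${f%.txt}"
```

Both run entirely inside the shell, which adds up when you're looping over thousands of file names.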
I started playing with string mangling last night (documented as 'Parameter Expansion' in the man pages). Given a variable, say $foo, parameter expansion acts on or modifies the variable's value.
For instance, ${foo:offset:length} acts as a substring operator. The man page also documents several forms for handling unset or null variables:
${parameter:-word}
    Use Default Values. If parameter is unset or null, the expansion
    of word is substituted. Otherwise, the value of parameter is
    substituted.
${parameter:=word}
    Assign Default Values. If parameter is unset or null, the
    expansion of word is assigned to parameter. The value of
    parameter is then substituted. Positional parameters and special
    parameters may not be assigned to in this way.
${parameter:?word}
    Display Error if Null or Unset. If parameter is null or unset,
    the expansion of word (or a message to that effect if word is
    not present) is written to the standard error and the shell, if
    it is not interactive, exits. Otherwise, the value of parameter
    is substituted.
${parameter:+word}
    Use Alternate Value. If parameter is null or unset, nothing is
    substituted, otherwise the expansion of word is substituted.
'parameter' is the variable you're testing; 'word' is subject to tilde expansion, parameter expansion, command substitution, and arithmetic expansion.
So, let's say that $foo is unset.
Code:
$ echo ${foo:-`echo "Hello World"`}
Hello World
Now, let's set $foo and then run the same substitution:
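The post ends before showing the result, but the continuation is easy to reconstruct (the value 'Goodbye' is invented for illustration): once foo is set and non-null, the default after :- is ignored:

```shell
foo="Goodbye"
# The parameter is set and non-null, so its own value is used
# and the `echo "Hello World"` default is never run
echo ${foo:-`echo "Hello World"`}
```

This time it prints Goodbye instead of Hello World.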
Here's another one that I thought was kind of nice:
I have to wait on people to upload files to me via FTP {ftps, scp, http...} from time to time. Let's say that I know that the file will be named 'Population.txt' or 'population.txt'
Code:
while true; do [ -f [Pp]opulation.txt ] && echo -e '\a'; sleep 2; done
Until the file arrives, it just sits there. Minimize the terminal, and it's out of sight, out of mind. Then, when the file arrives, it rings the terminal bell every 2 seconds.
I don't use this as much as I did six months ago. If I'm waiting for a particular condition, I'm more likely to use something like this:
Code:
while ! condition
do
    sleep 2
done
echo "Blah Blah Blah just happened" | mail -s "Blah Blah Blah" bchittenden@foo.com
This allows me to minimize a terminal window, or even run the whole process in the background, then get an email when the condition is true (note that 'condition' above would be a statement in valid shell syntax which can evaluate to true or false).
If I actually want to see what's going on (for instance watch the size of a file grow), I'll use the 'watch' command, which executes a given command every n seconds (default is 2). It refreshes the screen at every new execution, and generally looks slick.
To monitor files you should look into inotifywatch and inotifywait, part of inotify-tools. Linux-only stuff though. That eliminates the need to do periodical checks.
You could monitor a file or a whole dir for modification and send a mail each time a given file is modified, or moved, or whatever else.
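A sketch of what that might look like, assuming inotify-tools is installed; the directory name and mail address are invented:

```shell
# Watch uploads/ and alert once per completed upload.
# close_write fires when a writer closes a file it had open
# for writing, i.e. when the transfer has actually finished.
mkdir -p uploads
inotifywait -m -q -e close_write --format '%f' uploads |
while read -r name
do
    echo "upload finished: $name" | mail -s "upload: $name" you@example.com
done
```

Unlike the sleep loop, this blocks in the kernel until an event occurs, so there's no polling at all.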