Don't feel too embarrassed about doing it the hard way. We all experience that kind of thing sometimes; that's what helps you learn. Besides, I didn't really know myself whether you intended your final output to be like that, or if you were just keeping it simple for this discussion. That's why I discussed the loop in detail first before mentioning it.
You don't need -print0 when using find alone, because you aren't sending the output anywhere for another program to read. Instead you're executing du directly from within find, bypassing the shell entirely.
The basic find syntax is this:
find <starting dirs> <global options> <matching expressions> <actions>
Things like -print and -exec are actions; they do something with the files matched by the previous expressions. The various print options send the list to stdout, while the -exec option runs an external command on them. In this case -exec runs one instance of du for each file found, and it's du that's printing its output to the screen, according to its usual behavior.
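To illustrate, here's a sketch using the path from your earlier posts (the -sh flags are just one common choice for readable output, not something taken from your command):

```shell
# One du process per directory found; du prints each size itself:
find /home/myadmin -maxdepth 1 -type d -exec du -sh {} \;

# Same idea, but batching arguments so du runs as few times as possible:
find /home/myadmin -maxdepth 1 -type d -exec du -sh {} +
```

The `{} +` form is generally preferred when the command accepts multiple arguments, since it avoids forking one process per file.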
Re: your question about changing IFS: no, you are not changing the global setting this way. Any simple command directly preceded by a variable assignment (with only whitespace between them) will be launched with that variable in its environment, without affecting the parent environment.
IFS='' read -rd '' dir <input
In this case the IFS setting only affects the action of read. You could instead change the global IFS and it would work too, but then you have to remember to set it back to the default afterwards. This way is much cleaner.
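Here's a quick sketch to convince yourself of this (the variable names and input are just for demonstration):

```shell
saved=$IFS
IFS=':' read -r first rest <<'EOF'
a:b:c
EOF
echo "$first"    # prints: a   (read split on ':' for this one command only)
[ "$IFS" = "$saved" ] && echo "IFS unchanged in the parent shell"
```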
( Edit: There's one final thing about IFS+read that I need to mention. When IFS contains whitespace characters, read will strip any of them found at the beginning or end of the input, even when no other word-splitting is being done (i.e. when you're only setting a single variable rather than a list of variables or an array). Giving read a null IFS disables this, so you're guaranteed to get the entire input unaltered. )
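A minimal demonstration of that trimming (the brackets are just there to make the string's boundaries visible):

```shell
# Default IFS: leading/trailing whitespace is stripped from the input.
printf '  padded  ' | { read -r line; echo "[$line]"; }
# prints: [padded]

# Null IFS: the input comes through exactly as-is.
printf '  padded  ' | { IFS='' read -r line; echo "[$line]"; }
# prints: [  padded  ]
```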
As for your previous problems, referring back to post #7, the key issue is here:
arr_dir=(`find "/home/myadmin" -maxdepth 1 -type d`)
It's the same old word-splitting trouble again. The "`..`" command substitution first executes find and inserts its output into the array-setting command. THEN word-splitting occurs on that output, and the array ends up with one index per word, rather than the desired one per file.
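To make that concrete, here's a hypothetical demonstration with a directory name containing a space (the /tmp/demo path is made up for illustration; the loop is the same null-separator pattern as above):

```shell
mkdir -p "/tmp/demo/my dir"

# Word-splitting breaks "my dir" into two array elements:
arr=( $(find /tmp/demo -maxdepth 1 -type d) )
echo "${#arr[@]}"   # prints: 3  (/tmp/demo, /tmp/demo/my, dir)

# The null-separator loop keeps each filename whole:
arr=()
while IFS='' read -rd '' dir; do
    arr+=("$dir")
done < <(find /tmp/demo -maxdepth 1 -type d -print0)
echo "${#arr[@]}"   # prints: 2

rm -rf /tmp/demo
```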
There is no easy way to work around this kind of situation. When expanding variables and command substitutions you can only have it treated as a single word (by quoting it), or have it word-split according to the IFS value. There's no way to tell it which spaces are inside the filenames and which are separating them. That's why you have to use null separators and a loop.
Note that if the delimiter in the output were a non-whitespace character, such as a colon, we could use IFS to split it safely in this fashion. But null separators don't work for that, as I discovered in post #14.
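For example, a colon-delimited string (a made-up two-field value here) can be split into a bash array safely with IFS and read -a, and spaces inside the fields survive:

```shell
IFS=':' read -ra parts <<<"first entry:second entry"
echo "${parts[0]}"   # prints: first entry
echo "${parts[1]}"   # prints: second entry
```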
As for post #8, it's mostly ok actually, except for this:
done < <(`find "/home/myadmin" -maxdepth 1 -type d -print0`)
You have a command substitution inside a process substitution. As far as I can tell this should not work at all: the process-substitution subshell would attempt to treat the expanded file list as a command and try to execute it directly, which would no doubt fail. Remove the backticks and it will probably work.
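For reference, a corrected sketch of that loop with the backticks removed (the loop body here is just a placeholder, not something from your post):

```shell
while IFS='' read -rd '' dir; do
    printf 'Found: %s\n' "$dir"     # placeholder action
done < <(find "/home/myadmin" -maxdepth 1 -type d -print0)
```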
I'd also point out that this is a good example of why "$(..)" is better than backticks: with it, the improper nesting would have been much easier to see.
For a better understanding of IFS, start with the first three links to see how the shell processes arguments and whitespace internally, and then the fourth one for more on how IFS works specifically.
Finally, having a good understanding of the shell's parsing order, what happens before or after what, will help you to avoid many mistakes like this.