
Passions 11-06-2008 07:12 PM

How do I create multiple variables from a list in Bash?
 
Can anyone help me out? Let's say I have a simple txt file called girls.txt:

Janet
Harriet
Jill
Gertrude


I want to be able to assign each girl a variable. So far I have:

i=1
for names in `cat girls.txt`;do
eval list$i=$names
i=`expr $i + 1`
done

I get the correct names when I manually enter:
echo $list1
echo $list2

But if I add "echo $list$i" it freaks out. I want to throw that into the for loop so I can just spit out all the names.

i=1
for names in `cat girls.txt`;do
eval list$i=$names
echo $list$i
i=`expr $i + 1`
done


When I execute this, instead of the girls' names I get:

1
2
3
4


Any help? Thanks!!!

billymayday 11-06-2008 07:22 PM


So

list[1]=Janet

echo ${list[1]}

Note the {} in retrieval
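
For instance, a minimal sketch of the array approach dropped into the loop from the first post (the second loop, which prints everything, is just for illustration):

Code:

i=1
for name in `cat girls.txt`; do
    list[$i]=$name          # assign straight into the array element, no eval needed
    i=`expr $i + 1`
done

for n in "${list[@]}"; do   # ${list[@]} expands to every element of the array
    echo "$n"
done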

Kenhelm 11-06-2008 10:04 PM

Try:-
eval echo \$list$i
But, as billymayday has suggested, it might be better to use an array.
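
Dropped into the original loop, that looks something like this (a sketch; the backslash stops the shell from expanding $list too early, so eval ends up running e.g. 'echo $list1'):

Code:

i=1
for names in `cat girls.txt`; do
    eval list$i=$names
    eval echo \$list$i      # becomes 'echo $list1', 'echo $list2', ... after eval
    i=`expr $i + 1`
done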

Passions 11-07-2008 11:50 AM

Quote:

Originally Posted by Kenhelm (Post 3334128)
Try:-
eval echo \$list$i
But, as billymayday has suggested, it might be better to use an array.

Perfect! That solved my issue. :newbie: Cheers!

the dsc 05-19-2009 03:12 AM

How can we tweak this so the variables from the list can have spaces, like:

Charlie Brown
John Stuart Mill
Steve Martin


and so on?

The way it is, it takes each word as a variable; ideally it would take an entire line.


(thanks in advance... if I get it working, it will help *a lot* with undeleting (with ext3grep) tons of files I've accidentally deleted, and it could also improve a bit a screensaver-like-wallpaper-changer script I have :) )

ghostdog74 05-19-2009 03:53 AM

What do you actually want to do?

the dsc 05-19-2009 10:47 AM

I have a list of files I can potentially recover with ext3grep, which is somewhat like:

/dir/file.ext
/dir2/file with spaces in its name.ext
...

and with a script similar to this one I could replace the "echo" with something like ext3grep --restore-file /dir2/file with spaces in its name.ext

But that would require the script to somehow get the variables from whole lines, not just words. Or maybe I could somehow replace all the spaces with double underscores or some unlikely character and have the script somehow "decode" it, which would probably be even harder, I think.



This "undelete" program can theroretically "restore all", which is adequate if you have tons of files to recover and not only a few, but it's not working perfectly for me. It goes restoring for a little while, and then there's some obscure, highly-technical error which stops the process. But I still can recover files that were left behind, individually. A script with individual commands for each file would automate that for tons of files.



And an OT warning about the quick file-listing filter in Konqueror: if you filter for the filename "thrash" in a folder, so that various thrash1, thrash2, thrash3, [...] files appear, and you then select them all with Ctrl+A and delete them, you will not be deleting just the files you see selected, but all the files in the folder and its subfolders, which were selected by Ctrl+A too but are "hidden" from your sight. If you wanted to delete a large number of files, you may not even notice that the number/list is too large in the confirmation dialog... I didn't...

euroquisling 05-19-2009 01:46 PM

I use this mantra; it works with any filename:

Code:

some_command_that_outputs_list | while read x; do
    echo "$x"
    do_smth_with "$x"
done

Note the double quotes around $x; you have to have them in order to capture spaces and other weird characters in the filename.

This is nice because:

1. there's no limit on the number of items, unlike what you'd hit by passing a large number of arguments to a single command

2. it's as fast as the shell gets
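
Applied to the recovery list discussed above, a sketch (files-to-recover.txt and /dev/hda are placeholders for your own list file and partition; IFS= and read -r are extras that keep leading spaces and backslashes in the filenames intact):

Code:

while IFS= read -r file; do
    echo "restoring: $file"
    ext3grep /dev/hda --restore-file "$file"
done < files-to-recover.txt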

the dsc 05-21-2009 11:59 PM

It did work, thanks :)

If it's not too much to ask for more: would anyone know how to do something with this same basic principle, but instead of loading one item from the list at a time, loading, say, three lines/items and assigning each to a different variable? So instead of "do something with x", the command would be "do something with x, do something with y, do something with z".


I'll explain why, roughly. It's ext3grep, the program I'm using. In order to recover a file, it first has to do some sort of "scan" of the partition and save a sort of "report" file for its own use, which is then loaded when actually restoring something. This file is loaded once for each command line, and that takes some time. It's not terribly slow for a single instance, some seconds, but if we need to do this for many, many files, it quickly adds up to days for something that could otherwise be finished in hours :/

I think I have to add another "layer" of loop to the second script: first it would do more or less what it does now, but somehow cycling through some variables, and after having done that N times...

In some BASIC-like language it would be something like:



0 n=1
1 some_command_that_outputs_list | while read x; do
    a(n)=x
    n=n+1
    if n<5 then goto 1
    else
      command action a(1) action a(2) action a(3) action a(4) action a(5)
      goto 0



It's quite frustrating to know more or less the logic of what has to be done, and to be able to lay out a rough script, but at the same time not to know the proper syntax and the substitutes for things like "goto" and line numbers :/

chrism01 05-22-2009 12:51 AM

This sounds like you need to move up to a faster, more powerful lang like Perl: http://perldoc.perl.org/

Kenhelm 05-22-2009 01:41 PM

Try
Code:

n=1
some_command_that_outputs_list | while read a[n];do
  let n=n+1
  if [ $n -eq 6 ];then
    command action "${a[1]}" action "${a[2]}" action "${a[3]}" action "${a[4]}" action "${a[5]}"
    n=1
  fi
done

The variable 'a' is being used as a bash array.
'a[n]' is for setting the contents of the nth element of 'a'
'${a[n]}' returns the contents of the nth element of 'a'
http://tldp.org/LDP/abs/html/arrays.html
http://www.gnu.org/software/bash/man...ef.html#Arrays

If the list is in a file you can use this type of input to the loop:

Code:

n=1
while read a[n];do
...
done < filename

the dsc 05-22-2009 07:11 PM

Thanks Kenhelm!

Unfortunately it won't be directly useful to me for this task after all: contrary to what I said before, even if we give multiple actions to the same instance, it will still read the report file once for each action/file to recover.

But maybe I'm using an older version of ext3grep, since it was (I think) the author of this program himself who gave me this tip.

Besides adding this to my list of personal how-tos for scripts, I'll post a link to this discussion on the ext3grep list, so maybe it will be useful to people with a newer version, or to future versions, until it's finally "GUI-fied" or "ncursed".

the dsc 05-23-2009 05:23 PM

Me again.

Actually I've found a way to use the last script that should speed things up somewhat. It's not as fast as it would be if we could indeed restore many files while loading the "stage2" file only once, like "restore-all" does, but it's still something that could be useful until something better is implemented.

It's just the same script, but instead of giving the command multiple files to restore, it gives multiple independent commands.

Instead of:

ext3grep /dev/hda --restore-file ${a[1]}" --restore-file "${a[2]}" [...]

It should be:

ext3grep /dev/hda --restore-file ${a[1]}" & ext3grep /dev/hda --restore-file ${a[2]}" [...]


Theoretically it should be two times faster with two commands/instances, but at some point it will be slowed down by hardware/processing limits, I think, so there shouldn't be too many of them. I'm using about 7 or 8, I think.
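
Folded back into Kenhelm's loop, that could look something like this (a sketch only; files-to-recover.txt is a placeholder for the list file, and wait is the shell builtin that pauses until all the backgrounded restores have finished before the next batch is read):

Code:

n=1
while read a[n]; do
  let n=n+1
  if [ $n -eq 6 ]; then
    for i in 1 2 3 4 5; do
      ext3grep /dev/hda --restore-file "${a[$i]}" &   # each restore runs as its own background process
    done
    wait    # don't read the next five lines until these restores are done
    n=1
  fi
done < files-to-recover.txt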

I didn't really measure the time gained carefully or anything; I just have this impression from some simple, maybe imprecise tests with two simpler scripts running against a single script, with their results saved to independent logs. The script 1a + script 1b logs were larger than script 2's log. This logic may be fundamentally flawed in some dull way that will make me ashamed of having posted it, but I think it could possibly be just like that.

