Need 'cut' with multi-char delimiter
I need to tokenize a very long line of characters. cut would work perfectly; however, I need to use more than one character as the delimiter.
Does anybody have any ideas? I don't have perl, but I do have most of the standard linux command line tools available: cut, sed, awk ... Thanks |
Well, I'm probably not familiar with many programming terms/jargon, but could you provide a more concrete example of what needs to be done? I personally don't get it.
|
You can try awk, since its field separator can be a single character or a regular expression. For example, if I have
Code:
$ cat testfile
Code:
$ awk -F"delim" '{print $1; print $2; print $3}' testfile |
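A minimal sketch of this approach, using a hypothetical testfile whose lines use the literal string "delim" as the separator (the original file's contents were not shown):

```shell
# Create a sample file whose fields are separated by the
# multi-character string "delim" (hypothetical data).
printf 'aadelimbbdelimcc\n' > testfile

# awk's -F accepts a string/regex, not just a single character:
awk -F"delim" '{print $1; print $2; print $3}' testfile
# -> aa
#    bb
#    cc
```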
awk
Hi,
I just had a quick play with awk; it looks like the field separator option can take a regular expression, e.g. matching ',' or '=' with alternation: Code:
cat inputfile | awk -F \(,\|=\) '{print $2}' HTH, zer0x |
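The same idea with the regex quoted instead of backslash-escaped, on a hypothetical input line (the thread does not show the real data):

```shell
# Sample line that mixes '=' and ',' as separators (made-up data).
echo 'key=value,other' > inputfile

# Quoting the regex avoids escaping; a bracket expression also works:
awk -F'[,=]' '{print $2}' inputfile
# -> value
```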
Hi,
you mean the delimiter is not a multi-character string, but is sometimes one character and another time a different one? In that case the simplest way would be to redefine IFS (the Input Field Separator) and then use a simple loop: Code:
jan@jack:~/tmp> IFS=',=' |
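A sketch of the full IFS-plus-loop idea on a made-up line (the original post's loop was cut off, so this is a reconstruction of the stated approach, not the author's exact code):

```shell
# Treat both ',' and '=' as word separators by redefining IFS,
# then split a sample line with plain shell word splitting.
line='key=value,other'
IFS=',='
for token in $line; do   # $line must be unquoted so splitting occurs
    echo "$token"
done
unset IFS                # restore default word splitting
# -> key
#    value
#    other
```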
cut's delimiter must be a single character. You can first replace your multi-character delimiter with a single character that does not appear in the data, then use cut.
|
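That substitute-then-cut suggestion can be sketched like this, assuming a hypothetical "delim" separator and using ':' as the stand-in character:

```shell
# Collapse the multi-character delimiter "delim" to a single ':'
# (which must not occur in the data), then cut works as usual.
echo 'aadelimbbdelimcc' | sed 's/delim/:/g' | cut -d: -f2
# -> bb
```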