LinuxQuestions.org
Old 11-25-2008, 02:13 PM   #1
nshewmaker
LQ Newbie
 
Registered: Jul 2006
Distribution: Ubuntu
Posts: 9

Rep: Reputation: 0
Bash script to find files with shebang and then chmod


I want to look for all the files in a directory, and examine the first line of each for a shebang ("#!"). I can get close with the following command:

Code:
find dir/ -user $UID -type f -exec grep --files-with-matches --binary-files=without-match --max-count=1 --perl-regexp "^#\!\s*?\/" {} \; | xargs -r chmod -cf 755
This works, but when a file contains no match, grep inefficiently reads the entire file. Additionally, grep will accept a shebang on any line, even though one is only valid on the first line. I'd like to use the "head" command to crop only the first line, but I don't know how to do that in one step while keeping the filename needed for the second part of the operation (chmod).

I could handle this with a "higher-level" script, but I'd like to stick with bash, if possible. Any bash shell gurus out there with a solution?

Last edited by nshewmaker; 11-25-2008 at 02:15 PM. Reason: added code tags
 
Old 11-25-2008, 03:28 PM   #2
0.o
Member
 
Registered: May 2004
Location: Raleigh, NC
Distribution: Debian, Solaris, HP-UX, AIX
Posts: 208

Rep: Reputation: 35
Quote:
Originally Posted by nshewmaker View Post
I want to look for all the files in a directory, and examine the first line of each for a shebang ("#!"). I can get close with the following command:

Code:
find dir/ -user $UID -type f -exec grep --files-with-matches --binary-files=without-match --max-count=1 --perl-regexp "^#\!\s*?\/" {} \; | xargs -r chmod -cf 755
This works, but when a file contains no match, grep inefficiently reads the entire file. Additionally, grep will accept a shebang on any line, even though one is only valid on the first line. I'd like to use the "head" command to crop only the first line, but I don't know how to do that in one step while keeping the filename needed for the second part of the operation (chmod).

I could handle this with a "higher-level" script, but I'd like to stick with bash, if possible. Any bash shell gurus out there with a solution?
Code:
for i in `find dir/ -user $UID -type f`; do
     t=`head "$i" | grep '#!'`
     if [ $? -eq 0 ]; then
          echo "$i"
     fi
done
 
Old 11-25-2008, 07:51 PM   #3
uberNUT69
Member
 
Registered: Jan 2005
Location: Tasmania
Distribution: Xen Debian Lenny/Sid
Posts: 578

Rep: Reputation: 30
find dir/ -user $UID -type f | while read fn; do head -n1 "$fn" | grep -q "^#\!" && echo "$fn" && chmod 755 "$fn"; done
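One caveat with any plain `while read` loop: filenames containing leading whitespace or embedded newlines can still get mangled. GNU find's -print0 paired with a NUL-delimited read is the usual hardening. A sketch under that assumption (the shebang_demo directory and its files are invented here purely for illustration):

```shell
#!/bin/bash
# Sketch: NUL-delimited filenames survive spaces and newlines in names.
# shebang_demo is a throwaway directory created just for this example.
mkdir -p shebang_demo
printf '#!/bin/sh\necho hi\n' > 'shebang_demo/my script'
printf 'plain text\n'         >  shebang_demo/notes.txt
chmod 644 'shebang_demo/my script' shebang_demo/notes.txt

find shebang_demo -user "$(id -u)" -type f -print0 |
while IFS= read -r -d '' fn; do
    # Test only the first line; chmod and echo run only on a match.
    if head -n1 "$fn" | grep -q '^#!'; then
        chmod 755 "$fn"
        echo "$fn"
    fi
done
```

The IFS= prevents read from trimming whitespace, and -d '' makes it split on NUL bytes rather than newlines.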
 
Old 11-26-2008, 09:34 AM   #4
0.o
Member
 
Registered: May 2004
Location: Raleigh, NC
Distribution: Debian, Solaris, HP-UX, AIX
Posts: 208

Rep: Reputation: 35
Quote:
Originally Posted by uberNUT69 View Post
find dir/ -user $UID -type f | while read fn; do head -n1 "$fn" | grep -q "^#\!" && echo "$fn" && chmod 755 "$fn"; done

That's going to output the results of the grep command and head. I find it much easier to redirect that to a temp variable and output just the file name.
 
Old 11-26-2008, 10:04 AM   #5
Telemachos
Member
 
Registered: May 2007
Distribution: Debian
Posts: 754

Rep: Reputation: 59
This appears to work without the overhead of find (which seems like overkill to me here), but frankly this would be trivial in Perl (or many other languages), so I'm not sure why you'd stick with "pure" Bash.
Code:
#!/bin/bash
for item in testy/*
do 
  if [ -f $item ]; then 
    head -n1 $item | grep -q '^#!'
    if [ $? -eq 0 ]; then
      chmod +x $item
    fi
  fi
done
 
Old 11-26-2008, 10:30 AM   #6
uberNUT69
Member
 
Registered: Jan 2005
Location: Tasmania
Distribution: Xen Debian Lenny/Sid
Posts: 578

Rep: Reputation: 30
Quote:
Originally Posted by 0.o View Post
That's going to output the results of the grep command and head. I find it much easier to redirect that to a temp variable and output just the file name.
You haven't tried it.

In my suggestion above:
a) stdout from head is piped into grep, and grep is silent (-q); only grep's exit status is used.
b) the filename is echoed for your "convenience" (i.e. it's not necessary!)
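The exit-status mechanics are easy to see in isolation (demo.sh here is just a scratch file invented for the example):

```shell
#!/bin/bash
# grep -q prints nothing at all; only its exit status matters
# (0 = the pattern matched, nonzero = no match).
printf '#!/bin/sh\n' > demo.sh
head -n1 demo.sh | grep -q '^#!' && echo "demo.sh has a shebang"
```

The && chain only fires when grep exits 0, so nothing from head or grep ever reaches the terminal.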


Quote:
Originally Posted by Telemachos View Post
This appears to work without the overhead of find (which seems like overkill to me here), but frankly this would be trivial in Perl (or many other languages), so I'm not sure why you'd stick with "pure" Bash.
Because it's a one-(command-)liner .

Quote:
Originally Posted by Telemachos View Post
Code:
#!/bin/bash
for item in testy/*
do 
  if [ -f $item ]; then 
    head -n1 $item | grep -q '^#!'
    if [ $? -eq 0 ]; then
      chmod +x $item
    fi
  fi
done
except that:
a) * is not recursive
b) your solution doesn't check for files owned by $USER
c) you can hit "too many arguments" errors (i.e. with enough files, the expanded * can exceed the command-line buffer).

Your solution is still a one-liner, though (with the filenames quoted):

$ for item in dir/*; do [ -f "$item" ] && head -n1 "$item" | grep -q '^#!' && chmod +x "$item"; done
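On point (a), bash 4's globstar option makes ** recurse into subdirectories, so the glob approach can be made recursive without find. A sketch under that assumption (it still skips the $USER ownership check, and the demo_tree directory is invented for illustration):

```shell
#!/bin/bash
# Sketch: recursive glob via globstar (requires bash >= 4).
# demo_tree is a throwaway directory created just for this example.
mkdir -p demo_tree/sub
printf '#!/bin/bash\necho hi\n' > demo_tree/sub/tool
printf 'data\n'                 > demo_tree/readme
chmod 644 demo_tree/sub/tool demo_tree/readme

# globstar: ** matches any depth; nullglob: an empty match expands to nothing.
shopt -s globstar nullglob
for item in demo_tree/**/*; do
    # Only regular files whose first line starts with "#!" get chmod'd.
    if [ -f "$item" ] && head -n1 "$item" | grep -q '^#!'; then
        chmod +x "$item"
    fi
done
```

Note this still iterates one file per loop pass, so the glob never hits an external command's argument limit.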
 
Old 11-26-2008, 11:15 AM   #7
nshewmaker
LQ Newbie
 
Registered: Jul 2006
Distribution: Ubuntu
Posts: 9

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by Telemachos View Post
This appears to work without the overhead of find (which seems like overkill to me here), but frankly this would be trivial in Perl (or many other languages), so I'm not sure why you'd stick with "pure" Bash.
I fully intend to replace the bash script needing this bit of code with a Perl one. However, it's used in a delicate way, and I'd rather fix this single problem instead of potentially starting a new set of them.
 
Old 11-26-2008, 11:28 AM   #8
Telemachos
Member
 
Registered: May 2007
Distribution: Debian
Posts: 754

Rep: Reputation: 59
Quote:
Originally Posted by nshewmaker View Post
I fully intend to replace the bash script needing this bit of code with a Perl one. However, it's used in a delicate way, and I'd rather fix this single problem instead of potentially starting a new set of them.
If that works for you, cool. I would find it easier to use one tool and make the solution work there, rather than throwing together a one-liner now and fixing it later (with a different language).
 
Old 11-27-2008, 02:38 PM   #9
nshewmaker
LQ Newbie
 
Registered: Jul 2006
Distribution: Ubuntu
Posts: 9

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by uberNUT69 View Post
find dir/ -user $UID -type f | while read fn; do head -n1 "$fn" | grep -q "^#\!" && echo "$fn" && chmod 755 "$fn"; done
This works great! Thanks, uberNUT. I'm glad you understand the appeal of one-liners.
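For the record, the first-line test can also live entirely inside find, using one -exec as a predicate that gates the next. A sketch of that pattern (the findx_demo directory and its files are invented for illustration):

```shell
#!/bin/bash
# Sketch: the first -exec acts as a test; chmod and -print run only when
# the embedded sh exits 0 (i.e. the file's first line starts with "#!").
# findx_demo is a throwaway directory created just for this example.
mkdir -p findx_demo
printf '#!/usr/bin/env python\n' > findx_demo/run.py
printf 'plain data\n'            > findx_demo/data.txt
chmod 644 findx_demo/run.py findx_demo/data.txt

find findx_demo -user "$(id -u)" -type f \
     -exec sh -c 'head -n1 "$1" | grep -q "^#!"' _ {} \; \
     -exec chmod 755 {} \; -print
```

Because find ANDs its predicates by default, files failing the first -exec are skipped before chmod ever runs.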
 
Tags: bash, find, grep, shell