LinuxQuestions.org
Old 04-03-2010, 05:52 PM   #1
smeezekitty
Senior Member
 
Registered: Sep 2009
Location: Washington U.S.
Distribution: M$ Windows / Debian / Ubuntu / DSL / many others
Posts: 2,234

Rep: Reputation: 184
Get avg file size in kbytes


Using bash, is it possible to get the average size of the files in a directory containing ~2000 files?
 
Old 04-03-2010, 06:36 PM   #2
Johnney Darkness
LQ Newbie
 
Registered: Feb 2009
Posts: 1

Rep: Reputation: 0
here's a snippet that worked for me:

set -- $(du -s .) $(ls -1 | wc -l); echo $(( $1 / $3 ))


Explanation: du -s . prints "99999 ." (where "99999" is the directory size in KB and "." is the directory echoed back), and ls -1 | wc -l counts the files in ".". set grabs those words as $1, $2 and $3, so the arithmetic expansion returns the size divided by the count.
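A guarded variant of the same du-based idea, sketched as a function (my own sketch, not from the thread; it counts only regular files and refuses to divide when the directory is empty, and it assumes GNU du's -k flag):

```shell
# avg_kb DIR - rough average file size in KB for the files directly in DIR.
# Counts only regular files via find; bails out instead of dividing by zero.
avg_kb() {
    dir=${1:-.}
    count=$(find "$dir" -maxdepth 1 -type f | wc -l)
    if [ "$count" -eq 0 ]; then
        echo "avg_kb: no files in $dir" >&2
        return 1
    fi
    # du -sk reports the directory's total size in KB (first field).
    total_kb=$(du -sk "$dir" | awk '{print $1}')
    echo $(( total_kb / count ))
}
```

Note this still includes the directory's own blocks in the total, like the original one-liner, so it is an approximation rather than an exact per-file average.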
 
Old 04-03-2010, 07:33 PM   #3
smeezekitty
Senior Member
 
Registered: Sep 2009
Location: Washington U.S.
Distribution: M$ Windows / Debian / Ubuntu / DSL / many others
Posts: 2,234

Original Poster
Rep: Reputation: 184
Code:
ls: write error: No space left on device (ENOSPC)
bash: 41433/0: division by 0 (error token is "0")
Oh, and welcome to LinuxQuestions.
 
Old 04-03-2010, 08:04 PM   #4
GrapefruiTgirl
Guru
 
Registered: Dec 2006
Location: underground
Distribution: Slackware64
Posts: 7,594

Rep: Reputation: 550
Code:
#!/bin/sh

NUMFILES=$(find . -maxdepth 1 -type f | wc -l)
TOTAL=$((0))

for EACH in $(find . -type f -maxdepth 1 | ls -la | awk '{print $5}'); do
 TOTAL=$((TOTAL+EACH))
done

AVERAGE=$(echo "scale = 2; $TOTAL / $NUMFILES / 1000" | bc -l)

echo "Average file size: $AVERAGE Kbytes."
The above seems to work nicely for me. You may wish to double-check it on a small directory with a couple of files (and a calculator). Also note that to convert bytes to KBytes I've divided by 1000; you may wish to divide by 1024, depending on which sort of KBytes you want. Also note that it won't recurse deeper than the current directory. If you want to include files deeper than the working directory, changing the -maxdepth value should do it (untested by me).

ahsaS
 
Old 04-03-2010, 10:39 PM   #5
smeezekitty
Senior Member
 
Registered: Sep 2009
Location: Washington U.S.
Distribution: M$ Windows / Debian / Ubuntu / DSL / many others
Posts: 2,234

Original Poster
Rep: Reputation: 184
Quote:
Originally Posted by GrapefruiTgirl View Post
I've divided by 1000; you may wish to divide by 1024
Changed that.
Quote:
ahsaS
 
Old 04-04-2010, 06:33 AM   #6
grail
Guru
 
Registered: Sep 2009
Location: Perth
Distribution: Manjaro
Posts: 7,631

Rep: Reputation: 1958
I tried Sasha's solution, but I had a couple of files with spaces in their names, so the find blew up in my face.
So I came up with:

Code:
awk '$1 ~ /^-/{ c += $5; i++ }END{ print c / i / 1024"Kb"}' <(ls -la)
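For what it's worth, GNU find can print file sizes directly with -printf '%s\n', which sidesteps parsing ls output entirely and is also safe with spaces in filenames (a sketch assuming GNU findutils, not something from the thread):

```shell
# Average size of regular files in the current directory, in KB.
# find -printf '%s\n' emits one size-in-bytes per line; awk averages them.
find . -maxdepth 1 -type f -printf '%s\n' \
    | awk '{ total += $1; n++ } END { if (n) printf "%.2f Kb\n", total / n / 1024 }'
```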
 
Old 04-04-2010, 10:11 AM   #7
GrapefruiTgirl
Guru
 
Registered: Dec 2006
Location: underground
Distribution: Slackware64
Posts: 7,594

Rep: Reputation: 550
I just re-verified that my method works fine when filename(s) contain spaces. What happened for you, grail?

Code:
sasha@reactor: ls  
dzonky.sh*  hello\ there  i3_get-output-resolutions.pl*  i3_workspacetags.awk*  jsonparse*  show-i3-keys.sh*  testfile

sasha@reactor: NUMFILES=$(find . -maxdepth 1 -type f | wc -l)
sasha@reactor: TOTAL=$((0))
sasha@reactor: 
sasha@reactor: for EACH in $(find . -type f -maxdepth 1 | ls -la | awk '{print $5}'); do
>  TOTAL=$((TOTAL+EACH))
> done
sasha@reactor: 
sasha@reactor: AVERAGE=$(echo "scale = 2; $TOTAL / $NUMFILES / 1000" | bc -l)
sasha@reactor: 
sasha@reactor: echo "Average file size: $AVERAGE Kbytes."
Average file size: 7.25 Kbytes.

sasha@reactor:
 
Old 04-05-2010, 12:47 AM   #8
grail
Guru
 
Registered: Sep 2009
Location: Perth
Distribution: Manjaro
Posts: 7,631

Rep: Reputation: 1958
Sorry ... my bad, I didn't read the message properly; it was just the silly warning about maxdepth.
Although I have now run it again, and our outputs differ quite dramatically; it seems to behave oddly :$

So the directory I am running on has 11 files and 3 directories (including . & ..)

NUMFILES correctly records 11

But the for loop runs 82 iterations, and at the end has a total of 611878, where mine (and a calculator) shows 13947542

Then obviously the math comes out different for the average.

Edit: <DOH> just figured it out ... the following line, when run on its own, does an ls -la of where I am and not of the find output:

Quote:
find . -type f -maxdepth 1 | ls -la
I had altered the find to point at a directory other than ".", so the ls was still listing my current directory.
So the find is not required at all

Last edited by grail; 04-05-2010 at 12:52 AM. Reason: Not having patience
 
  


