Small script optimization
Hello everyone,
I have a simple script that searches for the folders below a base path (given as a parameter) and prints the number of files inside each of them. The problem is that some folders contain around 900,000 files, and the process is very slow. Code:
#!/bin/bash

Thanks |
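The original code block did not survive the archive, but from the description (print a file count for every folder below a base path), the slow version was presumably something along these lines. This is a hypothetical sketch, not the poster's actual script; `count_files` and the per-directory `find` are my own naming and structure:

```shell
#!/bin/bash
# Hypothetical sketch of the kind of script described above: for each
# directory below the base path, print how many regular files sit
# directly inside it. count_files is an assumed name, not the OP's.
count_files() {
    local base="${1:-.}"
    # One find enumerates the directories, then a second find plus wc
    # runs per directory -- two extra processes for every folder, which
    # is exactly what crawls on trees holding hundreds of thousands of
    # files.
    find "$base" -type d | while IFS= read -r dir; do
        printf '%s: %s\n' "$dir" "$(find "$dir" -maxdepth 1 -type f | wc -l)"
    done
}

if [ $# -gt 0 ]; then
    count_files "$1"
fi
```

The repeated fork/exec of `find` and `wc` for every directory is the likely bottleneck the thread is trying to remove.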
Here's another approach that might be faster:
Code:
#!/bin/bash

Note 2: The deletion of the temporary database file is commented out because you might find the locate command useful for other reasons and therefore want to keep it around. The creation of the db file in /tmp is, of course, arbitrary. It could be placed anywhere you want, although placing it in the tree you wanted to count might be counter-productive.

<edit> Here's a version that worked for me: Code:
#!/bin/bash

|
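Apart from the locate trick above (whose code was lost from the archive), another common speedup for this problem is to walk the tree once and tally files per parent directory in awk, instead of forking a `find | wc` pair for every folder. This is my own suggestion, not a reconstruction of either post; a sketch, assuming no path contains a newline:

```shell
#!/bin/bash
# Single-pass alternative (not the locate approach): one find lists
# every file, and awk strips the file name to recover the parent
# directory and counts occurrences. Breaks on paths with embedded
# newlines; directories containing zero files are not printed.
count_files_fast() {
    local base="${1:-.}"
    find "$base" -type f | awk '
        { sub(/\/[^\/]*$/, ""); count[$0]++ }   # trim /filename -> parent dir
        END { for (d in count) printf "%s: %d\n", d, count[d] }'
}

if [ $# -gt 0 ]; then
    count_files_fast "$1"
fi
```

One design note: this trades the per-directory process spawns for a single `find` traversal plus one `awk`, so the cost grows with the number of files rather than the number of directories times two process launches.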
If that isn't fast enough, consider Perl. It calls the underlying C library routines directly and runs nearly as fast as C (it's compiled on the fly before being run), so it should be much quicker than chaining shell-level programs.
|