
skifun 01-09-2010 06:56 AM

Help for automate script for a webcam
Hello there,

recently I bought a D-Link DCS-2121 web camera in order to upload a snapshot to a web server. This damned cam has no option (I have tried lots of firmwares) to create a single file and keep overwriting it; instead it continuously uploads into directories with random names (I cannot work out where they come from), with snapshot filenames based on the date and time.

So I am trying to create a cron job in my cPanel menu to find the newest file in those directories, every 2 minutes let's say, and copy it to a new location. I have searched the forum and found some solutions, but I am not experienced enough a user to combine them.

This command

ls -lrt | grep test | tail -1 | awk '{print $NF}'
seems to find the newest file matching "test", for example, but I cannot modify it to search all subdirectories, and it probably also needs an -exec (or cp) step to copy that file to a new destination, all in one command.

Thanks for your patience; any help would be highly appreciated.

Simon Bridge 01-09-2010 07:28 AM

Your problem is that ls won't list recursively down the directory tree.
You need to use a search function instead, like find, to generate your list of files.
Alternatively, use grep directly.

How are you capturing the photos?

Can you provide an example of the filenames (just ls them and copy the output over)? One of us may see the pattern, and that will simplify your search routine considerably.

I have a feeling you are doing this by a more convoluted method than is needed.
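For example, a find-based search can pick the newest file recursively by modification time rather than by name. A minimal sketch (the directory tree and filenames below are made up for illustration):

```shell
#!/bin/sh
# Illustration only: build a small directory tree resembling the camera's
# random upload folders (names here are invented for the example).
uploads=$(mktemp -d)
mkdir -p "$uploads/rand1" "$uploads/rand2"
touch -d '2010-01-09 18:55' "$uploads/rand1/_20100109_185516.jpg"
touch -d '2010-01-09 19:07' "$uploads/rand2/_20100109_190752.jpg"

# Recursively list files with their mtimes, sort numerically,
# keep the last (newest) entry, then strip the timestamp column.
newest=$(find "$uploads" -type f -printf '%T@ %p\n' | sort -n | tail -n1 | cut -d' ' -f2-)
echo "$newest"
```

Sorting on the modification time instead of the path avoids depending on how those random directory names happen to sort.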

skifun 01-09-2010 12:11 PM

Thanks for the reply.

I have already read the article you proposed, and plenty more besides. I said I am not experienced in Linux, not in computers :) (although that statement could spark a long conversation...). I have also tried firmware upgrades and even downgrades, but the option to write to a single file simply isn't there anywhere.
Here are the options it provides.

Yes, I know that ls only lists the current directory, but as I said I am not an experienced user. I will nevertheless try to find a solution, and that's why I am asking for some help.

There is also another way, with a direct URL with the help of dlinkddns, but it's a bad idea to expose your password over the net.

The filename could be, for today, _20100109_185516.jpg, or if I used the prefix test, for example, it would be test_20100109_185516.jpg. I tried just a few minutes ago and it wrote to a folder on the web server the images _20100109_190452.jpg and _20100109_190752.jpg (180 sec is the interval).

What else should I try, instead of searching for complicated commands?

GrapefruiTgirl 01-09-2010 12:37 PM


shell# cp -f -T "$(find /<searchpath> -type f | sort | tail -n1)" /<destination-file>
The above is an idea, but I'm not sure if it is exactly what you want -- I'm having a bit of trouble understanding the situation.

This code finds the latest file (based on a numerical sort of the pathnames) under <searchpath>, recursively, and copies it to <destination-file>.

Note that there's not really any sanity checking or error checking, but maybe it's a start.
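A variant with a basic check, wrapped in a function for reuse, could look like this (a sketch: copy_newest, the .jpg filter, and the switch to sorting by modification time are my own additions, chosen because the upload directories have random names):

```shell
#!/bin/sh
# copy_newest SEARCHPATH DESTFILE  (hypothetical helper, not from the thread)
# Picks the newest .jpg under SEARCHPATH by modification time, and only
# calls cp when a file was actually found, instead of running cp with
# an empty argument.
copy_newest() {
    newest=$(find "$1" -type f -name '*.jpg' -printf '%T@ %p\n' \
        | sort -n | tail -n1 | cut -d' ' -f2-)
    if [ -z "$newest" ]; then
        echo "no snapshot found under $1" >&2
        return 1
    fi
    cp -f "$newest" "$2"
}
```

Called as e.g. `copy_newest /home/user/uploads /home/user/latest.jpg`, it exits non-zero when the search path is empty, which makes failures visible in cron mail.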


skifun 01-10-2010 04:40 PM

Yes!

That's what I was looking for, and it works well.
I added the following line to my cPanel cron job and everything is absolutely OK, bypassing the firmware's stupidity.


cp -f -T "$(find /home/username/public_html/directory -name "filename*" | sort | tail -n1)" /home/username/public_html/filename
Thank you very much GrapefruiTgirl ;)

GrapefruiTgirl 01-10-2010 04:43 PM

Great! Glad it helped.

As noted though, you may want to throw in some error checking to use it reliably in a production environment.
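For instance, a cron entry with a found-file check added could look like this (a sketch using the paths from the post above; not tested on cPanel):

```shell
# crontab entry: every 2 minutes, copy the newest matching file; do nothing if none found
*/2 * * * * f=$(find /home/username/public_html/directory -name "filename*" | sort | tail -n1); [ -n "$f" ] && cp -f -T "$f" /home/username/public_html/filename
```

The `[ -n "$f" ]` guard stops cp from running with an empty source argument when the camera hasn't uploaded anything yet.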

