Linux - Newbie
I have a situation where there are about 100K-150K files in folders divided by month. These are snapshot images captured by a Synology Surveillance Station. The problem is we have way too many files to work with, as the system was capturing about every 10 seconds. The images are ultimately needed to create a time-lapse sequence covering about a 12-15 month period.
So I was trying to see if someone could help me figure out whether a command could be used to search these folders and move only files that fall at 60-minute intervals into a designated smaller folder. The captures do not run in full 24-hour cycles per day, though; the system was set to capture during daylight hours, roughly 8am-6pm. Hope this makes sense.
Going forward, we reduced the capture frequency to avoid this problem, but we still need to deal with the older captures.
60-minute intervals = 1 file every 60 minutes.
Math required.
You have to get a start time from the first file in the folder, then compare each subsequent file against the last one kept to check whether there is a 60-minute gap between the two.
If yes, move it (one or both); if no, skip and go to the next batch in the same or a different folder.
So I think it is going to take a little more than just one command. A rough sketch of that compare-and-move logic is below.
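Something like this, as a minimal sketch only: it assumes GNU find/sort, timestamps taken from mtime, filenames without embedded newlines or tabs, and the paths are hypothetical placeholders.
Code:
#!/bin/bash
# Keep one file per 60-minute window, judged by mtime gap from the last kept file.
src=/path/to/month_folder      # hypothetical: folder holding one month of snapshots
dst=/path/to/hourly            # hypothetical: where the kept files go
mkdir -p "$dst"

last=0
# Emit "epoch_mtime<TAB>path" for every file, oldest first.
find "$src" -maxdepth 1 -type f -printf '%T@\t%p\n' | sort -n |
while IFS=$'\t' read -r mtime path; do
    mtime=${mtime%.*}                    # drop fractional seconds
    if (( mtime - last >= 3600 )); then  # at least 60 minutes since last kept file
        mv -- "$path" "$dst"/
        last=$mtime
    fi
done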
Definitely need more information on the data being worked on, and how it is actually being stored, i.e. same dir, dir/sub-dir, or different dirs, etc.
So my word problem, I believe, still stands: get files 60 minutes apart, then move them somewhere else.
Yes, I see now. My example is relative to the current time, but the time offset does not have to be in round units. Even the offset can be made relative to the current time. But maybe that introduces the possibility of a kind of race condition.
I'm thinking that the easier way to get absolute blocks of time might be with Perl and CPAN's File::Find module.
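For what it's worth, the absolute-blocks idea can also be sketched in plain shell, since the thread's other examples are shell; this is only an illustration with hypothetical paths, not the File::Find approach itself. It buckets each file by its epoch hour and keeps only the first file landing in each bucket:
Code:
#!/bin/bash
# Assumes GNU find/sort, bash 4+ (associative arrays), and tab/newline-free filenames.
declare -A seen
find /path/to/month_folder -maxdepth 1 -type f -printf '%T@\t%p\n' | sort -n |
while IFS=$'\t' read -r mtime path; do
    hour=$(( ${mtime%.*} / 3600 ))   # absolute 60-minute block since the epoch
    if [[ -z ${seen[$hour]} ]]; then
        seen[$hour]=1
        mv -- "$path" /path/to/hourly/   # hypothetical destination
    fi
done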
Yes, but a basis for comparison still needs to be set, because there is a whole bunch of files already created mixed in with a bunch more that will not have the same intervals.
He has one bunch taken at 10-second intervals spanning over a year's worth, which then changed to one every 60 minutes over a 24-hour period, non-stop.
If they are all held within the same directory, does he want to go back to the very first photo taken, then every one after that at 60-minute intervals, save all of those, and get rid of the rest?
Or just pull out the ones he started taking every 60 minutes and save them somewhere else, and do whatever with the others from before the capture interval was changed?
Thanks for the helpful info. I am trying to come up to speed on the command line. I can do most rudimentary stuff, but this situation is a bit complicated, so I realize I need help.
1. I had about 7 months of these images all sitting in one folder.
2. Using the find command with the mtime parameter, I managed to move them out into folders by month/30 days. There are 6 folders, from June through to Dec. (A rough sketch of that kind of split is below the list.)
3. However, there are still way too many files in those folders.
4. Since this is for time-lapse use, we only need snapshots on an hourly basis.
5. The snapshots were set to run from about 7am to 6pm. I could give you specific start and end times if that would help.
6. So I don't mind running a command, or several, on each month's folder to move out images that are roughly 60 minutes apart, starting from the beginning of the snapshot sequence.
Please let me know if you need any further details from me.
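For reference, a hedged sketch of the kind of mtime-based split mentioned in step 2; the destination path is hypothetical, the day ranges would differ per month, and -t is GNU mv:
Code:
# Move files last modified between 30 and 60 days ago into their own folder.
mkdir -p /path/to/by_month/prev
find . -maxdepth 1 -type f -mtime +30 -mtime -60 -exec mv -t /path/to/by_month/prev {} +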
You also have the -newerXY option in find, which can be used to match against an absolute date and time when used with X=m and Y=t (i.e. -newermt). Here, files from 2017-05-20 are grouped by their hour of last modification:
Code:
d='2017-05-20'
for h in $(seq 0 23); do
    echo -e "\n $h\n"
    # List files whose mtime falls within hour $h of day $d.
    # Note: -newermt means strictly "newer than", so a file stamped exactly
    # $h:00:00, or in the final second after $h:59:59, sits on a boundary.
    find /tmp/X/ -type f \
         -newermt "$d $h:00:00" \
         \! -newermt "$d $h:59:59" \
         -print
done
Is that how your files are grouped? Can the times to search for be predicted automatically?
So I have folders set aside by month. Inside those folders are a ton of images taken at 10-second intervals. I want to scan those folders, extract images that are spaced 60 minutes apart in time, and put them in a different folder. The starting time and/or reference is not critical. These are going to be used in a time-lapse sequence over a 12-16 month period, so the day/month data is not critical in terms of those relative times. Hope that makes more sense.
There are not that many folders, so if I can find a command that works for a single month, I can then apply it with slightly different variables to the others.
Or look at post #3 and get an idea of how to write it with two loops, then set it up to do one dir after the other. Run it, go get a coffee or whatever, and come back when it's all done, instead of having to re-run the script for each dir separately.
That for loop would work too.
The outside loop:
Takes care of your (more than one) dirs.
You can also have it switch the destination directory when you change to another search dir; post #3 shows how to set that up. A rough sketch of that two-loop layout is below.
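Something along these lines, as a sketch only, with hypothetical paths and the same GNU find/sort assumptions as above: the outer loop walks the month folders, the inner pass keeps one file per hour and sends it to a matching destination folder.
Code:
#!/bin/bash
for dir in /path/to/snapshots/*/; do
    dst=/path/to/hourly/$(basename "$dir")   # per-month destination
    mkdir -p "$dst"
    last=0
    find "$dir" -maxdepth 1 -type f -printf '%T@\t%p\n' | sort -n |
    while IFS=$'\t' read -r mtime path; do
        mtime=${mtime%.*}
        if (( mtime - last >= 3600 )); then  # 60 minutes since last kept file
            mv -- "$path" "$dst"/
            last=$mtime
        fi
    done
done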