LinuxQuestions.org
Programming This forum is for all programming questions.
The question does not have to be directly related to Linux and any language is fair game.

Old 04-13-2009, 04:47 AM   #16
hi_irf
LQ Newbie
 
Registered: Apr 2007
Posts: 15

Rep: Reputation: 0

Hi,

colucix, thanks for your valuable input. As you said, the find command will search the entire directory tree, including subdirectories.
I am trying to back up an image server. My root directory has 10 subdirectories, and those subdirectories hold millions of images. So when I search for images uploaded within the last 24 hours to copy them to the backup server, find takes a lot of time.

Also, once find has located the images from the last 24 hours, I append the results to a text file and use that file to copy the images to the backup server. I run my bash script once a day through a cron job.

My question: is that approach OK, or is there a better solution?
 
Old 04-13-2009, 08:14 AM   #17
Sergei Steshenko
Senior Member
 
Registered: May 2005
Posts: 4,481

Rep: Reputation: 454
Quote:
Originally Posted by hi_irf View Post
Hi,

colucix, thanks for your valuable input. As you said, the find command will search the entire directory tree, including subdirectories.
I am trying to back up an image server. My root directory has 10 subdirectories, and those subdirectories hold millions of images. So when I search for images uploaded within the last 24 hours to copy them to the backup server, find takes a lot of time.

Also, once find has located the images from the last 24 hours, I append the results to a text file and use that file to copy the images to the backup server. I run my bash script once a day through a cron job.

My question: is that approach OK, or is there a better solution?
I think there is a whole bunch of backup programs out there with, among other things, incremental backup - that's what you are trying to do, and your approach seems OK.

I guess, though, that backup programs also take other needs into consideration - ones we mere mortals haven't thought of.
 
Old 05-15-2009, 03:01 AM   #18
hi_irf
LQ Newbie
 
Registered: Apr 2007
Posts: 15

Rep: Reputation: 0
Hi,

I have the following lines in my text file, which contains more than 20 million lines:

/home/apache/ht/images/5/1/200904021159351.jpg
/home/apache/ht/images/5/1/20090401113951.jpg
/home/apache/ht/images/5/1/200904041728451.jpg
/home/apache/ht/images/9/5/2561195/2561195-20090402015241.xls
/home/apache/ht/images/attachments/9/5/2561195/2561195-20090407011002.xls
/home/apache/ht/images/attachments/9/5/2561195/2561195-20090406090748.xls


Now I want to remove the filename from every line, so the file looks like this:


/home/apache/ht/images/5/1/
/home/apache/ht/images/5/1/
/home/apache/ht/images/5/1/
/home/apache/ht/images/9/5/2561195/
/home/apache/ht/images/attachments/9/5/2561195/
/home/apache/ht/images/attachments/9/5/2561195/

The following command works for me:

awk -F"/" '{$NF=""}1' OFS="/" file

How about if I want to remove the entire path except the file name? For input like

/home/apache/ht/images/5/1/200904021159351.jpg
/home/apache/ht/images/5/1/20090401113951.jpg
/home/apache/ht/images/5/1/200904041728451.jpg
/home/apache/ht/images/9/5/2561195/2561195-20090402015241.xls
/home/apache/ht/images/attachments/9/5/2561195/2561195-20090407011002.xls
/home/apache/ht/images/attachments/9/5/2561195/2561195-20090406090748.xls

the output should look like this:

200904021159351.jpg
20090401113951.jpg
200904041728451.jpg
2561195-20090402015241.xls
2561195-20090407011002.xls
2561195-20090406090748.xls

Thanks in advance
 
Old 05-15-2009, 03:06 AM   #19
ghostdog74
Senior Member
 
Registered: Aug 2006
Posts: 2,697
Blog Entries: 5

Rep: Reputation: 244
Then just print the last field. I'll leave it to you to find out what the last field is.
 
Old 05-15-2009, 08:42 AM   #20
schneidz
LQ Guru
 
Registered: May 2005
Location: boston, usa
Distribution: fedora-30
Posts: 5,289

Rep: Reputation: 916
^ Um, isn't this exactly what basename was created for?

Quote:
Originally Posted by schneidz View Post
would basename work
man basename

then maybe you could sed s/`the return of basename`/""/ in a loop ?
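A minimal sketch of both routes on a couple of sample paths (the file names here are just illustrations, not the real image list):

```shell
# Two sample paths standing in for lines of the real 20-million-line list
p1=/home/apache/ht/images/5/1/200904021159351.jpg
p2=/home/apache/ht/images/9/5/2561195/2561195-20090402015241.xls

# basename strips the directory part; dirname keeps only the directory
basename "$p1"        # -> 200904021159351.jpg
dirname "$p2"         # -> /home/apache/ht/images/9/5/2561195

# awk equivalent, which avoids forking one process per line
printf '%s\n%s\n' "$p1" "$p2" | awk -F/ '{print $NF}'
```

For a 20-million-line file, a single awk (or sed) pass is much faster than calling basename once per line in a shell loop.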
 
Old 05-16-2009, 12:38 AM   #21
hi_irf
LQ Newbie
 
Registered: Apr 2007
Posts: 15

Rep: Reputation: 0
Ghost,

I hope you are doing great. I got the answer. I tried the following command:

awk -F"/" '{print $NF}' filename

where {print $NF} prints the last field of each line. Anyway, thanks for your help.

Cheers,
 
Old 05-16-2009, 01:48 AM   #22
jschiwal
LQ Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 678
You were asking about restricting find searches to more recent files. Since the timestamp of a directory also changes when you save a file in it, you could first use find to locate directories with newer timestamps. "-type d" restricts the results to directories only, and "-maxdepth <num>" restricts how deep to search.

Then the results of the directory search can be used as the directory argument(s) for the next find command:
find $(find /path/to/directory -maxdepth 1 -type d -ctime -<#days>) -type f -ctime -<#days>

If you want the list of files produced by find, you can have it print out only the path name; look in the info manual for "printf". You can run the results through sort (if they aren't sorted already) and uniq.
However, you may be able to simply have find search for recent directories, if what you want is a list of directories that have been modified recently (by saving or deleting a file inside).
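A runnable sketch of that two-stage search, using a throwaway directory (the paths and names are made up for the demo; note that the unquoted $(...) would break on directory names containing spaces):

```shell
# Set up a demo tree: one subdirectory gets a fresh file
mkdir -p /tmp/twostage/a /tmp/twostage/b
touch /tmp/twostage/a/new.jpg

# Stage 1: subdirectories whose timestamp changed within the last day
# Stage 2: recent files inside only those subdirectories
find $(find /tmp/twostage -mindepth 1 -maxdepth 1 -type d -mtime -1) \
    -type f -mtime -1
# prints /tmp/twostage/a/new.jpg, since only a/ holds a recent file
```

On a tree with millions of old images, stage 1 prunes most of the work before the per-file search ever starts.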
 
Old 05-16-2009, 07:34 AM   #23
hi_irf
LQ Newbie
 
Registered: Apr 2007
Posts: 15

Rep: Reputation: 0
Hi,

Most of the time I have to find files based on the creation date of the file. Let's suppose I have to find files under /home created within the last week. I usually do

find /home -type f -mtime -8

From the find manual I found that it supports atime, ctime, and mtime, where

atime is access time: when the file was last read.
ctime is change time: when the file's metadata (ownership, permissions) or contents last changed.
mtime is modification time: when the file's contents were last modified.

So my question is: is there any way to find files based on creation time? Does mtime work for that, or is there some other command?

Thanks
 
Old 05-17-2009, 02:27 PM   #24
schneidz
LQ Guru
 
Registered: May 2005
Location: boston, usa
Distribution: fedora-30
Posts: 5,289

Rep: Reputation: 916
^ What's the difference between change time and modification time?
 
Old 05-17-2009, 07:50 PM   #25
Telemachos
Member
 
Registered: May 2007
Distribution: Debian
Posts: 754

Rep: Reputation: 60
Quote:
Originally Posted by schneidz View Post
^ What's the difference between change time and modification time?
And Google says, http://www.brandonhutchinson.com/ctime_atime_mtime.html
 
Old 05-18-2009, 01:18 AM   #26
hi_irf
LQ Newbie
 
Registered: Apr 2007
Posts: 15

Rep: Reputation: 0
That's the same thing I want to know. Again, I just want to search for files created within 2 days, or a specific number of days. So should I search using mtime or ctime?
 
Old 05-18-2009, 06:46 AM   #27
Telemachos
Member
 
Registered: May 2007
Distribution: Debian
Posts: 754

Rep: Reputation: 60
Quote:
Originally Posted by hi_irf View Post
That's the same thing I want to know. Again, I just want to search for files created within 2 days, or a specific number of days. So should I search using mtime or ctime?
Check the link I posted above. In a nutshell, none of these times actually tracks creation time, since each of them can be changed by a variety of things after the file's creation.
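The ctime/mtime distinction is easy to see with stat (GNU coreutils syntax is assumed here; %Y is mtime and %Z is ctime, both as epoch seconds). chmod touches only metadata, so it bumps ctime while mtime stays put:

```shell
# Create a file, wait, then change only its permissions (metadata)
touch /tmp/ctime-demo
sleep 1
chmod 600 /tmp/ctime-demo    # metadata change: ctime moves, mtime doesn't

stat -c 'mtime=%Y ctime=%Z' /tmp/ctime-demo
```

After the chmod, ctime is at least one second later than mtime, even though the file's contents never changed.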
 
Old 05-20-2009, 06:04 AM   #28
hi_irf
LQ Newbie
 
Registered: Apr 2007
Posts: 15

Rep: Reputation: 0

Hi,

At the end of the day, I have come to the conclusion, from the postings of different users, that find's atime, ctime, and mtime don't do exactly what my scenario needs.

So can someone help me get a listing of files created within the last 24 hours across different subdirectories, so I can use those paths (filenames) to copy the incremental files to the backup server? For example, given

May 19 /root/file1.txt
May 20 /root/file2.txt
May 20 /root/file3.txt

I need today's files (created in the last 24 hours). What command should I use to list only

May 20 /root/file2.txt
May 20 /root/file3.txt

Earlier than this I was using

find . -mtime -1 -type f -print > listing.txt


Cheers,
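A runnable sketch of that listing step plus the copy step (the source and destination paths here are demo placeholders, and -print0/xargs -0 keeps file names with spaces safe):

```shell
# Demo source and destination, standing in for the image tree and backup mount
mkdir -p /tmp/incsrc /tmp/incdst
touch /tmp/incsrc/file2.txt /tmp/incsrc/file3.txt

# List files modified within the last 24 hours...
find /tmp/incsrc -type f -mtime -1 -print > /tmp/listing.txt

# ...and copy the same set, NUL-separated to survive odd file names
find /tmp/incsrc -type f -mtime -1 -print0 | xargs -0 -I{} cp {} /tmp/incdst/
```

Run once a day from cron, -mtime -1 catches exactly the files changed since the previous run.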
 
Old 05-21-2009, 07:22 AM   #29
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 7.7 (?), Centos 8.1
Posts: 17,790

Rep: Reputation: 2538
Why don't you just use rsync? It finds only changed/new files, and for changed files it transmits only the changes (i.e., it doesn't need to transmit the whole file).
 
Old 05-22-2009, 08:29 AM   #30
jschiwal
LQ Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 678
Another option is to use tar with a snapshot (timestamp) file. It will only back up files that have changed. You can pipe tar through ssh and extract the files in another tar command, or you can have the backup directory mounted. Using ssh works best if you use public key authentication.

Example:
tar -C /path/to/dir -g timestampfile -cf - . | ssh user@host tar -C /path/to/backup/dir -xf -

Example:
tar -C /path/to/dir -g timestampfile -cf - . | tar -C /path/to/mounted/backup/dir -xvf - >logfile

See the "Incremental Dumps" section of the info tar manual for details.

---

Of course, you could create tar backups instead of using tar to copy files.

Last edited by jschiwal; 05-22-2009 at 08:30 AM.
 
  

