Programming: This forum is for all programming questions. The question does not have to be directly related to Linux and any language is fair game.
I may be overthinking this, but I am running into roadblocks on something I think should be simple. I will say I am still a newbie when it comes to creating bash scripts, but here is what I have and what I am trying to accomplish.
I have a directory with subdirectories, and within each subdirectory there are files: docx Word documents, which will soon be PDFs. I wasn't sure if there was an easy way to convert the docx files to PDF or if it was easier to just save the files as PDFs. My primary concern is to take the newest dated files, which should be Friday's, and move them to a new location. If a file is already there, move it to an archive directory, and delete the file in the archive directory if one already exists. I have a Windows share mounted at /opt/reports, which is the primary location everyone saves their reports to. On the Linux (SLES) server running Apache, I want to take their Friday reports and move them to /srv/www/htdocs/reports. The file currently in /srv/www/htdocs/reports I want to move to /srv/www/htdocs/reports/archive, deleting the file already in the archive directory. The goal is to make these reports viewable from the webserver while keeping last week's reports in case someone still wants to view them.
Windows Server Structure
Reports ---> Manager 1 ---> manager_date_reports.docx
        ---> Manager 2 ---> manager_date_reports.docx
        ---> Manager 3 ---> manager_date_reports.docx
        ---> Manager 4 ---> manager_date_reports.docx
Apache Server
reports ---> manager_date_reports.docx (all 4)
archive ---> manager_date_reports.docx (all 4 from the previous week)
I hope that is somewhat clear. I just want to take the latest file from within each directory on the Windows server and copy it to the web server. Before that, remove the archived files within the archive directory and move the files currently in the reports directory on the webserver to the archive directory. There should really only be two files from each manager at any given time: the current week's in reports and last week's in archive.
I think this may be easier. The files within the directories will be saved as "manager <date>.pdf". So it should be a matter of taking all the *.pdf files and moving them from the reports directory to the archive directory, removing the <date> from the file name, which would just leave manager.pdf.
Wondering if it would be easier to take "manager <date>.pdf" and, when I copy it to the new location, just remove the date so it is manager.pdf. Then remove the files in the archive folder and move the ones currently in the reports directory to the archive directory. Finally, copy the "manager <date>.pdf" files from the Windows server and strip the date, so in reports they would just be manager.pdf.
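Untested sketch of that rotation, in case it helps. The real paths would be /opt/reports (the mounted share) and /srv/www/htdocs/reports from your post; here the sketch builds a throwaway tree under /tmp with made-up manager names so it can be run safely, and it assumes the newest PDF in each manager's directory is the one to publish:

```shell
#!/bin/bash
# Weekly rotation sketch. Real paths from the thread would be:
#   src=/opt/reports   web=/srv/www/htdocs/reports
# A disposable demo tree is built here instead, with hypothetical names.
set -e

top=$(mktemp -d)
src=$top/opt/reports
web=$top/srv/www/htdocs/reports
arc=$web/archive
mkdir -p "$src/Manager 1" "$src/Manager 2" "$arc"

# sample data: an older and a newer report for one manager
touch -d '8 days ago' "$src/Manager 1/alice 2014-09-19.pdf"
touch                 "$src/Manager 1/alice 2014-09-26.pdf"
touch                 "$src/Manager 2/bob 2014-09-26.pdf"
touch "$web/alice.pdf"   # last week's published copy
touch "$arc/alice.pdf"   # copy from two weeks ago

# 1. drop the oldest copies from the archive
rm -f "$arc"/*.pdf

# 2. rotate the currently published reports into the archive
mv -f "$web"/*.pdf "$arc"/ 2>/dev/null || true

# 3. publish each manager's newest PDF, stripping the date from the name
for dir in "$src"/*/; do
    newest=$(ls -t "$dir"*.pdf 2>/dev/null | head -n 1)
    [ -n "$newest" ] || continue
    base=$(basename "$newest")              # e.g. "alice 2014-09-26.pdf"
    cp -f "$newest" "$web/${base%% *}.pdf"  # -> "alice.pdf"
done

ls "$web"   # alice.pdf, archive, bob.pdf
ls "$arc"   # alice.pdf
```

The `${base%% *}` parameter expansion chops everything from the first space onward, which is what strips the date; that obviously relies on the file names always being "name<space>date.pdf".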
Not sure if you've started on a bash script yet, but if you have, you should post what you have. I'd also break down the functions you want to perform, mainly because it seems that the top-level process keeps varying as you learn more, make different decisions about the final result, or impose different interface rules on the users of this.
One suggestion is to copy a small part of the existing directory tree somewhere else and test your draft scripts against it before you let the script loose on the actual directory of reports. Further, the webserver part IMHO is separate from the script, which would make the decisions to move or supersede files as well as test whether given report files are duplicates.
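Making that disposable copy might look like this ("/opt/reports/Manager 1" is the share path from the original post; the sandbox location is arbitrary):

```shell
# Copy one manager's directory to a disposable sandbox before pointing any
# draft script at the real share. cp -a preserves timestamps, which matters
# if you later test date-based find commands against the copy.
sandbox=/tmp/report-sandbox
mkdir -p "$sandbox"
if [ -d "/opt/reports/Manager 1" ]; then
    cp -a "/opt/reports/Manager 1" "$sandbox/"
else
    echo "share not mounted on this machine; sandbox created empty"
fi
```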
Logically it sounds as if you just want to check for the latest file, detect any older ones, archive all of them, but don't overwrite existing archives, and then delete older ones once you know they have been archived properly.
You can use the find command to locate files which have been modified on or after a certain date. I had to do a quick search because I rarely search by date; here's one quick find, kudos to the poster: http://www.cyberciti.biz/faq/howto-f...files-by-date/. I like the gimmick using the -newer option; I have used that before. You create a placebo file, use touch to set it to a highly specific date and time, and then search for all "-type f" (files) which are newer than the temporary file you created and dated.
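A self-contained sketch of that trick, using made-up file names and a temp directory so it can be run as-is (GNU touch/find assumed, which should hold on SLES):

```shell
# demo of the -newer "placebo file" trick in a throwaway directory
dir=$(mktemp -d)
touch -d '2014-09-20' "$dir/old report.pdf"
touch -d '2014-09-26' "$dir/new report.pdf"

# the placebo/reference file, dated just before the Friday we care about
touch -d '2014-09-25 23:59' "$dir/stamp"

# list only files modified after the reference file's timestamp
find "$dir" -type f -newer "$dir/stamp" ! -name stamp
# lists only "new report.pdf"
```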
I have tried the bash script route but it didn't turn out well. I have now moved on to rsync -avzh /directory /newdirectory. This gives me all the files and the directory structure, which is good. Now I am just trying to figure out how to remove all the files within the directories that are older than two weeks; since these files get populated weekly on Friday, I just want the last Friday and the Friday before. Renaming the files is also challenging. I can't figure out how to change the name from "manager 2014-09-26.docx" to just manager.docx within that manager's directory.
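For the "older than two weeks" part, one way to express it is find's -mtime test, assuming the files' modification times track the Fridays they were saved. A runnable sketch against throwaway data:

```shell
# demo in a temp dir: keep the last two Fridays, drop anything older
dir=$(mktemp -d)
touch -d '3 days ago'  "$dir/manager 2014-09-26.docx"
touch -d '10 days ago' "$dir/manager 2014-09-19.docx"
touch -d '17 days ago' "$dir/manager 2014-09-12.docx"

# -mtime +13 matches files last modified more than 13 full days ago,
# i.e. older than the two most recent weekly runs
find "$dir" -type f -mtime +13 -delete

ls "$dir"   # only the 3- and 10-day-old files remain
```

-delete is a GNU find extension; `-exec rm -f {} \;` is the portable equivalent.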
Well, as far as removing files beyond a certain date, you can use forms of the find command to locate those files, once again based on their dates, and then extend the find command using -exec to remove them:
Code:
find <directory> -type f <qualifiers like -name or -newer and string> -exec rm -f {} \;
# The general form is hard to read; as an example, to find all .pdf files and remove them:
find /reports-directory -type f -name "*.pdf" -exec rm -f {} \;
And once again, you can use find to change the file names, with the caveat that if there is more than one file of those types in the same directory, there will be problems:
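An untested sketch of that rename, assuming the "name<space>date.ext" pattern and one report per directory; it uses a plain mv loop rather than any rename(1) variant, since those differ between distros. The guard line skips names without a space, which also prevents the pitfall of find re-matching already-renamed files:

```shell
# demo in a temp tree: strip the date portion from "manager 2014-09-26.docx"
top=$(mktemp -d)
mkdir -p "$top/Manager 1" "$top/Manager 2"
touch "$top/Manager 1/alice 2014-09-26.docx"
touch "$top/Manager 2/bob 2014-09-26.docx"

find "$top" -type f -name '*.docx' | while IFS= read -r f; do
    dir=$(dirname "$f")
    base=$(basename "$f")
    [ "${base%% *}" != "$base" ] || continue   # skip names with no date part
    ext="${base##*.}"
    mv "$f" "$dir/${base%% *}.$ext"   # "alice 2014-09-26.docx" -> "alice.docx"
done
```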
That seems more useful run as part of a script which can determine whether there is more than one file in any given location. Another reason why a script is best: say instead of renaming in place you also move the files somewhere else; this could potentially cause find to encounter them again in their newly renamed form and act upon them a second time. Just inefficient; there are smarter ways to do that.
Again, if you post your script attempts, people can offer suggestions, but they won't typically just offer a script based on a request, unless it turns out to be a one-liner or something.
My best suggestion also is to add "set -xv" just after your #!/bin/bash line so that you enable debugging; then when you run your script you'll see trace output as it executes.
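For example (a trivial script, just to show where the line goes; the path is the web root from the thread):

```shell
#!/bin/bash
set -xv   # -v echoes each line as it is read; -x echoes each command
          # after expansion; both traces go to stderr

reports=/srv/www/htdocs/reports   # illustrative only
echo "would rotate $reports here"
```

Running it shows every line and every expanded command interleaved with the script's normal output, which makes it easy to see which branch or expansion went wrong.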
The added thing here is you're trying to do "moderate" stuff and automate it, so a script is the best choice. It would be one thing if you just wanted to manually change something occasionally, but you're talking about periodic, recurring tasks. Therefore you want a script, plus a way to make that script run periodically. Running it on a schedule, such as via a cron job, is a follow-on enhancement; first you should get things automated to your satisfaction.
Remember also that all a bash script is, is a bunch of commands that are pretty much exactly what you could type into a terminal; the main difference is that you run the script instead of typing the commands one by one.