Linux - Enterprise
This forum is for all items relating to using Linux in the Enterprise.
I work for a medium-sized hospital with several remote locations. We are currently running various Windows-based servers.
The trouble is that our users require more and more space for data and email archive storage. Administration does not want to use quotas (as they are our biggest offenders). This is getting expensive, and we are looking for solutions.
Is there software out there that will allow us to manage a file server in such a way that newer data is readily accessible to users, but older data can be stored on a separate NAS device, possibly compressed? Of course, this will need to be transparent to the end users. Soft links?
Throw in a few more details. Exactly how can we discriminate between 'new' and 'old'?
If new and old can be determined, e.g. by file type, a VB script might do it. Otherwise just dump administration's files on the network server. It's not vital medical data.
What we had in mind was to take files that had not been accessed or modified in, say, the past 6 months, and move them from the file server to another storage device. The idea is that if users have not needed them, we can tuck them away on a device with bigger, slower hard drives.
We could do this manually with robocopy, but it would mean that our users would have to look in two locations for their files, and that would never fly.
Symantec Enterprise Vault is supposed to do something like this, but it is not an option at this point in our fiscal cycle.
I know this probably won't help... BUT...
We switched to Google Apps here; and it's been nothing but smooth sailing...
Google Docs, Calendar, Email, Chat, and 7 years of archiving for about $80 per user per year. Completely hosted solution...
Price comparison was this...
We were going to upgrade to Exchange 2010; including servers/licensing it was about $300K.
For Google Apps it was $20,000 per year.
Assuming it's a 5-year solution:
Exchange 2010 is still $300K.
Google is $20,000 x 5, which is $100,000.
Still cheaper.
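The arithmetic above can be sanity-checked in a couple of lines (a quick sketch; the 250-user count is my assumption, inferred from $20,000/year at $80 per user per year):

```shell
# Back-of-the-envelope check of the figures in the post above.
# 250 users is an assumed count, inferred from $20,000/yr at $80/user/yr.
users=250; per_user=80; years=5
google=$(( users * per_user * years ))   # 5-year Google Apps cost
exchange=300000                          # one-time Exchange 2010 estimate
echo "Google Apps over $years years: \$$google vs Exchange 2010: \$$exchange"
```

That works out to $100,000 versus $300,000, matching the post's numbers.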
Just a thought... maybe a hosted solution would be better (doesn't have to be Google... just sharing our experience)?
It would be nice to switch to Google Apps or Open Office and not have to pay Microsoft Office licensing costs, but that was tried before I started here. There was not enough support from higher administration at the time to have all users make the switch (accounting in particular has/had tons of Access databases they were not willing to convert), so it is now rejected out of hand.
One of the things that I have learned working at a hospital is that everything is vital medical data. Apparently the Feds also require extensive backups to be available on request.
1. Locate data more than 6 months old
2. Create a transparent link
3. Move old data - keep folder structure
4. Compress old data (optional)
5. Work with our current shares and security groups
This is a Linux forum, and the Linux solution would be to read the man pages for find, xargs, etc. I'm nervous about shortcuts or symlinks on Windows systems. Can you handle that end? Then it seems you need something like (excuse syntax):
for i in */ ; do
    # 180 days ~ 6 months; zip -m deletes the originals after archiving
    mkdir -p "/some/where/else/far/away/$i" && chmod 750 "/some/where/else/far/away/$i"
    find "$i" -type f -atime +180 -print0 | xargs -0 zip -m "/some/where/else/far/away/$i/archive.zip"
done
If you run that, the results are pretty unpredictable until you tidy it up, but you get the drift. Then simply email folks and tell them their old files are in their backup directory. And move _all_ of admin's stuff to the backup - it sounds like they need a reminder not to throw their weight around. :)
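For the "transparent link" requirement in the list above, one approach (a sketch with made-up stand-in paths, not the poster's exact setup) is to move a stale directory to the archive volume and leave a symlink in its place, so users still browse a single tree:

```shell
# Sketch: move a stale directory to the archive volume, then leave a
# symlink behind so the old path still works. All paths are stand-ins;
# temp dirs are used here so the example is self-contained.
share=$(mktemp -d)/share; archive=$(mktemp -d)/archive
mkdir -p "$share/reports/2008" "$archive/reports"   # pretend 2008 is stale
mv "$share/reports/2008" "$archive/reports/2008" &&
    ln -s "$archive/reports/2008" "$share/reports/2008"
ls -l "$share/reports"   # 2008 now points at the archive copy
```

Whether Windows clients follow the link depends on how the share is exported (e.g. Samba's follow symlinks / wide links settings), which is exactly the worry raised above, so test that end first.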