I use SuSE (but Ubuntu is probably just as easy), Samba and rsync to back up data between machines.
Quote:
Originally Posted by zephyrcat
- Automatic backup of documents and photos
I use rsync as a cron job. I just do daily backups, but if you look at the rsync resources there are ways to do more frequent backups, as well as ways to keep history without blowing storage space.
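Roughly, the cron + rsync idea looks like this (the function name, paths and schedule below are just illustrative, not my exact setup):

```shell
# backup SRC DST: mirror SRC into DST with rsync.
backup() {
    # -a preserves permissions, ownership and timestamps;
    # --delete removes files from DST that no longer exist in SRC,
    # so DST stays an exact copy.
    rsync -a --delete "$1" "$2"
}

# Hypothetical crontab entry to run it daily at 02:00:
# 0 2 * * * /usr/local/bin/backup.sh /home/me/documents/ /mnt/backup/documents/
```

Note the trailing slash on the source path: with rsync that means "the contents of this directory", not the directory itself.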
Quote:
Originally Posted by zephyrcat
- Direct wired connection to one (or more) computers for fast sending of HUGE files for editing
I use Gigabit Ethernet between my main machines. I had to get the latest drivers and do a bit of tweaking with buffer sizes, and I now get between 300 and 450 Mbit/s (limited by PCI and hard disk performance). No issues noticed with large files or with Samba.
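For the buffer-size tweaking, the usual knobs are the kernel's socket buffer limits. A hypothetical /etc/sysctl.conf fragment might look like this (the values are illustrative, not my exact settings; apply with `sysctl -p` and benchmark before and after):

```
# Raise the maximum socket receive/send buffer sizes (example values)
net.core.rmem_max = 16777216
net.core.wmem_max = 16777216
# TCP autotuning limits: min, default, max (bytes)
net.ipv4.tcp_rmem = 4096 87380 16777216
net.ipv4.tcp_wmem = 4096 65536 16777216
```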
Quote:
Originally Posted by zephyrcat
- Mirroring files to separate hard drives (not with RAID 1, because I don't want to have to have all the hard drives be equal sizes)
rsync and cron can schedule backups to wherever you like; however, that's not true mirroring. It depends how real-time you need it to be. There are Linux tools that notify you when something changes in a filesystem (can't remember what but I know they exist), so you could use one of those to trigger rsync.
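One way to wire that up is inotifywait from the inotify-tools package (which may well be the kind of tool alluded to above): it blocks until something under a directory changes, then you re-run rsync. A rough sketch, with hypothetical paths:

```shell
# watch_and_sync WATCH_DIR DEST: re-run rsync every time anything
# under WATCH_DIR is modified, created, deleted or moved.
# Requires inotifywait from the inotify-tools package.
watch_and_sync() {
    watch_dir="$1"
    dest="$2"
    # inotifywait blocks until an event fires, then the loop body runs.
    while inotifywait -r -e modify,create,delete,move "$watch_dir"; do
        rsync -a --delete "$watch_dir/" "$dest"
    done
}

# Example invocation (paths are made up):
# watch_and_sync "$HOME/documents" /mnt/backup/documents/
```

This is near-real-time rather than instantaneous, and rapid bursts of changes will trigger several rsync runs, but for a home setup that's usually fine.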
Quote:
Originally Posted by zephyrcat
- Hiding some of those drives to other computers
I use rsync to make backups to a separate drive and then Samba to publish a read-only view of the backup data. Samba can control access to data independently of the core file/directory permissions.
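The Samba side of that is just a share definition in smb.conf along these lines (the share name, path and user list here are made-up examples):

```
[backup]
   comment = Read-only view of the backup drive (example share)
   path = /mnt/backup
   read only = yes
   browseable = yes
   guest ok = no
   valid users = alice bob
```

`read only = yes` stops clients writing regardless of the underlying Unix permissions, and `valid users` restricts who can see the share at all; drives without a share definition simply don't appear to other machines.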
Quote:
Originally Posted by zephyrcat
- Reliability!!!
I'm still running SuSE 10.0 as my production server, and it's fair to say that the only reliability problems have been when I've decided to tinker with it! In terms of coping with either large numbers of files or very large files, there have been performance issues reported with some file systems and with Samba, but I haven't experienced them myself. It's worth looking at what other people say about choice of file system (I'm now on ext3 rather than ReiserFS, but have heard that XFS is better suited to large files).