01-25-2014, 12:06 PM | #1
Member | Registered: May 2011 | Location: Netherlands | Distribution: Debian, Archlinux | Posts: 268
Space saving filesystem suggestions, please?
I have a spare hard disk at home. I'd like to use it mainly for storing anime episodes and streaming them to a Raspberry Pi.
It would be nice if I could compress it, as I don't care about computational overhead or random access speed.
I looked into btrfs because of its compression support, but then I learned about deduplication. I think deduplication would be ideal for storing anime series.
I haven't found any FUSE solutions, but I can imagine that something which basically writes one giant compressed archive to the disk might be effective (and effectively "deduplicate").
It would also only be necessary to present the files to other computers as ordinary, uncompressed files.
One problem is that I'm currently running Debian Squeeze (old-stable). I've been wanting to try Arch or Debian Sid anyway, so this compression idea might be as good a reason as any to do so. I enjoy making weird shizzle work, so I'm willing to experiment and learn.
Could anyone nudge me in the right direction, please?
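To make the idea concrete, this is roughly what I'm imagining, with archivemount standing in as an example of the kind of FUSE tool I mean (untested on my side, so treat it as a sketch):
Code:
# Pack the collection into one big compressed archive...
tar -czf /mnt/disk/anime.tar.gz -C ~/ anime/
# ...then present it as ordinary files via FUSE.
mkdir -p /mnt/anime
archivemount /mnt/disk/anime.tar.gz /mnt/anime
ls /mnt/anime/anime    # episodes appear as normal, uncompressed files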
01-25-2014, 02:16 PM | #2
LQ Guru | Registered: Feb 2003 | Location: Virginia, USA | Distribution: Debian 12 | Posts: 8,370
Here is an article in Linux Journal which gives the results of several compression programs used on different types of data.
http://www.linuxjournal.com/article/8051
For myself, disks are so cheap that I just buy more disks rather than using compression.
---------------------
Steve Stites
Last edited by jailbait; 01-25-2014 at 02:18 PM.
01-27-2014, 05:51 PM | #3
Moderator | Registered: Mar 2008 | Posts: 22,228
Many modern video formats will not compress well, so I kind of doubt you will save much space. I forget what the current best compression method is.
Btrfs and ZFS have the ability to compress data on the fly, as do maybe one or two others.
Another way might be to use some other form of on-the-fly compression: FUSE, squashfs, or aufs, maybe. A rough sketch of those options is below.
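Something like this, for instance (device names and mount options are illustrative; check your distro's man pages):
Code:
# btrfs: transparent compression, applied as data is written.
mkfs.btrfs /dev/sdb1
mount -o compress=lzo /dev/sdb1 /mnt/media   # or compress=zlib for a better ratio

# ZFS on Linux: per-dataset on-the-fly compression.
zpool create tank /dev/sdb1
zfs set compression=lz4 tank

# squashfs: a read-only compressed image, mounted like any filesystem.
mksquashfs ~/anime anime.sqsh -comp xz
mount -o loop anime.sqsh /mnt/anime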
01-27-2014, 10:08 PM | #4
Moderator | Registered: Dec 2009 | Location: Germany | Distribution: Whatever fits the task best | Posts: 17,148
As jefro already pointed out, video files are usually already compressed and will not compress any further; they will even get larger if you try, due to the added overhead. You may save space by converting them to a different compression format, for example from MPEG-2 to H.264, but this will also decrease quality (re-encoding always causes quality loss).
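For example, such a conversion might look like this with ffmpeg (the -crf value is an assumption to tune to taste; lower means better quality and larger files):
Code:
# Re-encode MPEG-2 video to H.264, copying the audio track unchanged.
# Expect smaller files, but also some quality loss.
ffmpeg -i episode.mpg -c:v libx264 -crf 20 -preset slow -c:a copy episode.mkv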
01-28-2014, 01:43 AM | #5
Member (Original Poster) | Registered: May 2011 | Location: Netherlands | Distribution: Debian, Archlinux | Posts: 268
Yes, that is true. But the main problem with compressed file-systems seems to be that they perform per-file compression. If you have a collection of files with identical parts, disk-block deduplication or solid-block compression might still work.
The problem with compressed archives seems to be that the dictionary is based on a "local window"; it might not look back far enough to identify repetition across large files. But this is just from very shallow research into the matter.
Per-block deduplication, as opposed to per-file, could save space, and I believe significantly, mainly because of the shared opening sequences of anime episodes.
I'll look into ZFS and the exact deduplication methods of btrfs.
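On the ZFS side, the knobs appear to be per-dataset. A hedged sketch (keeping in mind the well-known caveat that the ZFS dedup table needs a lot of RAM):
Code:
# ZFS block-level deduplication: identical records are stored only once.
zpool create tank /dev/sdb1
zfs create tank/anime
zfs set dedup=on tank/anime          # dedup is enabled per dataset
zfs set compression=lz4 tank/anime   # compression stacks with dedup
# After copying data in, the DEDUP column shows the achieved ratio:
zpool list tank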
Last edited by Weapon S; 01-28-2014 at 01:58 AM.
03-18-2014, 03:35 AM | #6
Member (Original Poster) | Registered: May 2011 | Location: Netherlands | Distribution: Debian, Archlinux | Posts: 268
I was getting ready to dig into the source, but the btrfs documentation has been updated to clearly state that bedup performs per-file deduplication, not per-block.
I also remembered that a solid-block compressed archive has horrendous random access.
This means that, as of yet, there is no space-saving solution for my case.
This probably makes me look even more naive, but I'm convinced a per-block deduplicating open-source file system would be a great idea. A rough way to estimate the potential savings is sketched below.
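To put a number on that conviction, here is a rough, hypothetical sketch (GNU coreutils assumed) that hashes fixed-size blocks across a directory and counts duplicates; block boundaries restart at each file, so it only catches alignment-friendly repetition:
Code:
#!/bin/bash
# Estimate potential per-block dedup savings: hash every 128K block of
# every file under the given directory, then count duplicate blocks.
BS=$((128 * 1024))
find "${1:-.}" -type f -print0 |
while IFS= read -r -d '' f; do
    # GNU split --filter pipes each 128K chunk to md5sum on stdin,
    # printing one hash per block.
    split -b "$BS" --filter='md5sum' -- "$f" 2>/dev/null
done |
awk '{print $1}' | sort | uniq -c |
awk -v bs="$BS" '{ total += $1; unique++ }
END { printf "blocks: %d  unique: %d  potential saving: ~%.0f MiB\n",
      total, unique, (total - unique) * bs / 1048576 }'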
03-18-2014, 04:48 AM | #7
Member | Registered: Sep 2008 | Location: The Netherlands | Distribution: Slackware64 current | Posts: 594

1 member found this post helpful.