Shrinking a crazy-sized video file
I have basically the same 1-hour videos in two sets of files. One set is ~4G each, the other ~400M each. Now there are issues with some of the ~400M files (10 of them), but, as you might imagine, I don't want to store 40G to replace ~4G. I ran ffprobe on them and have the data.
Code:
bash-5.1$ cat file-big.txt | grep -A50 Metadata
The bitrate leaped out at me as wildly different. What's an acceptable bitrate? Is there anything else I can change without junking quality?
EDIT: The 'grep' bit just drops the compile-time options & libraries. |
First off, the resolutions are different. The top one is full HD, 1920x1080; the other is 720x402, which is roughly 7 times fewer pixels, so that largely explains why the files are nearly 10 times smaller. Remember that bitrate is the rate at which data must be transferred to build the image, so, again, I'd expect the bitrate of the full-HD stream to be roughly 10x that of the smaller one.
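For what it's worth, the pixel-count argument above can be checked with plain shell arithmetic (the resolutions are the ones from the ffprobe output):

```shell
# pixel counts for the two resolutions reported by ffprobe
big=$((1920 * 1080))    # the full-HD files
small=$((720 * 402))    # the ~400M files
# integer ratio of pixel counts
echo "1080p has about $((big / small))x the pixels of 720x402"
```

The integer ratio comes out at about 7x, so "nearly 10 times smaller" on disk also reflects a somewhat lower bitrate per pixel in the small files.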
Quote:
For what it's worth, 720x402 is pretty close to 720x480, which was the standard resolution for NTSC DVD (PAL, as used in Ireland/UK, was 720x576). All of these encodings are "lossy"; if you can't tell the difference in your normal use case, keep the smaller ones. If you can see the difference, try re-encoding with different settings until you find a balance you're happy with. |
Thanks for the reply. I'm funny about quality. Having grown up in the days when TVs had a "405/625" switch, you got used to what you had. Yes, the smaller one is grainier. I have a 1920×1080@60Hz setup, but once I'm 5 minutes into something I forget about the quality, unless it's really bad. I never felt the need for 4K, never mind 8K.
With that information, I can probably resize the big one to 1280×720, which will give me a fairly decent size reduction. The standard for PAL was 625 lines, but if you exclude the invisible lines and flyback time you are left with 576 visible lines. I left all that behind in the 1970s and went into research. SECAM had even more lines (819?). |
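A minimal sketch of that 1080p-to-720p resize. The arithmetic shows how much of the pixel budget survives the downscale; the ffmpeg line is left commented out as an illustration with hypothetical filenames, not a tested command:

```shell
# how much of 1080p's pixel count remains at 1280x720
pct=$((1280 * 720 * 100 / (1920 * 1080)))
echo "1280x720 keeps ${pct}% of the 1080p pixels"
# the resize itself - hypothetical filenames, illustrative only:
# ffmpeg -i input.mkv -vf scale=1280:720 -c:v libx265 -crf 23 output.mp4
```

At a similar quality setting, roughly 44% of the pixels should translate into a file somewhere around half the size, give or take what the encoder does with detail.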
Quote:
I have downloaded and converted all the episodes of Supergirl, which originally were ~4 GB each with an excessively high H264 video bitrate, often more than 12 Mbps. H264 is an ancient and inefficient codec, so I'm not surprised it needs that much bitrate for a quality image. I re-encoded them with H265 at ~4600 Kbps. The size dropped by almost half but the quality remained the same. https://i.imgur.com/cFdGlP6.png |
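Those bitrates line up with the file sizes: back-of-envelope, size ≈ bitrate × duration ÷ 8. A quick check, assuming a one-hour episode (the duration is an assumption; the bitrates are the ones quoted above):

```shell
# approximate video-stream size in MB for a given bitrate and duration
# size_bytes = bits_per_second / 8 * seconds
secs=3600   # one-hour episode (assumed)
echo "12 Mbps   -> ~$((12000000 / 8 * secs / 1000000)) MB/hour"
echo "4600 kbps -> ~$((4600000 / 8 * secs / 1000000)) MB/hour"
```

That gives roughly 5400 MB/hour versus 2070 MB/hour for the video stream alone, which is consistent with ~4 GB files dropping to about half the size.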
One other suggestion is ffmpeg. It converts between codecs easily: it handles h264 well and, with the proper libraries installed on your system, can export h265. Just my :twocents:
I was already planning that :D
Just to report back. Using
Code:
ffmpeg -i input.mkv -c:v libx265 -vtag hvc1 output.mp4
The conversion took 23 minutes using 850% - 1100% CPU. There seems to be zero loss of quality in that conversion. Interestingly, ffmpeg only had one process using multiple threads, which bespeaks some very elegant programming. Any optimizations welcome. |
Try '-preset veryslow'. It'll give you improved, "smoother" visual quality; the default is medium. The expense is more CPU time.
If you want to push the shrinking a little bit more, try '-crf x'. By default, x is 23. The working range is 0 to 51. 0 is practically lossless at the expense of more hard drive space; I usually use 1 or 2 when I need near-lossless. 51 is maximum lossy at a huge expense in visual quality; I usually stop at around 28 to 32 on the lossy side. Most of the time the default of 23 is good enough for me. Maybe add '-pix_fmt yuv420p' too, for better media-player compatibility.
Code:
ffmpeg -i input.mkv -c:v libx265 -crf 23 -preset veryslow -pix_fmt yuv420p -vtag hvc1 output.mp4
To quickly test ffmpeg options, add '-t 30 -ss 120' between ffmpeg and -i. You'll get a 30-second sample starting two minutes into the video. |
Thanks, but that's quite small enough for me. In other news, I'm chopping other big videos (~2.1G per episode) back to ~500-600 Megs with the same technique. Doing the job in less than 20 mins @ ≅900% CPU for a 1-hour video would be good. That would be flat out for 90 minutes on my old laptop, which would probably cook. For a laugh, I'll try one on a 1.5GHz RazPi with 'time' running :D.
EDIT: Mine's one of the early Pi 4Bs. Later ones did 1.8GHz and had an adjustable (CPU) PSU, so you could specify an 'over_voltage' setting in /boot/config.txt. I just don't know if mine handles that, and I'm not really an overclocker anyhow. |
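Going the other way, a target size implies a target average bitrate, which is a handy sanity check before committing to a long encode. A rough sketch, assuming a one-hour episode and the ~600 Meg target mentioned above (both figures taken from this thread, not measured):

```shell
# average bitrate (kbps) needed to land on a target file size
target_mb=600   # assumed target size from above
secs=3600       # one-hour episode (assumed)
kbps=$((target_mb * 1000 * 1000 * 8 / secs / 1000))
echo "~${kbps} kbps average to land at ${target_mb} MB"
```

That works out to roughly 1333 kbps, comfortably inside what x265 handles well at these resolutions; with '-crf' the encoder picks the bitrate itself, so this is only a sanity check on the expected output size.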
Well, that timing test on the RazPi is on hold.
I installed a disk image on the RazPi, which is the way to go, but it left me light on development stuff. So after a few false starts, I decided to mirror the slarm64 packages from slackware.uk and do a mass 'upgradepkg --install-new'. I already have a backup, just in case. This is all to achieve the timing tests, which are not in any way a priority. It's good to have development stuff, but it's far from essential on a RazPi. That libx265 is also a sizeable compiling job - 200 Megs of code! |