On the assumption that (for whatever reason) tar doesn't like untarring a 1200MB archive, why not try extracting smaller numbers of files at a time? You should be able to find all the info you need with `man tar` (which is what I'm working from, so I don't know if this'll work!). But basically, you should be able to run:
Code:
tar xzf myarchive.tgz a_file_in_myarchive
or
Code:
tar xzf myarchive.tgz a_directory_in_myarchive
And it'll pull out either a single file or a single directory. (The name has to match the path exactly as it's stored in the archive; `tar tzf myarchive.tgz` will list the stored paths. Since it's a .tgz, use `z` for gzip rather than `j` for bzip2.)
Alternatively, you can queue them up:
Code:
tar xzf myarchive.tgz fileone filetwo filethree fileseven directoryone directorythree directoryeleven
Some gratuitous scripting with a judicious amount of awk will let you extract 10, 20, or 400 files at a time. It'll probably take longer than doing it all at once, but it should work.
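Something like this rough sketch might do the batching (just an idea, assuming GNU tar and xargs, and that none of the member names contain spaces):
Code:
# list the archive contents, grab entries 101-200,
# and hand them to tar as extra member arguments
tar tzf myarchive.tgz | awk 'NR>100 && NR<=200' | xargs tar xzf myarchive.tgz
Adjust the NR range to pick whichever slice of the archive you want on each pass.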
Of course, extracting everything at once means that when tar hits a corrupt file it fails and exits, leaving you with 700mb of stuff still stuck in the archive. If you extract one file at a time, you may find that one file is corrupt but the ones after it are fine: only that iteration fails, and the rest carry on as normal.
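A minimal sketch of that one-at-a-time loop (same assumptions as above; a failed member just gets reported and skipped):
Code:
# extract each member individually; a corrupt entry only kills that
# one tar invocation, and the loop carries on with the next member
tar tzf myarchive.tgz | while IFS= read -r member; do
    tar xzf myarchive.tgz "$member" || echo "failed on: $member" >&2
done
Note it re-reads the archive from the start for every member, which is why it'll be slower than a single extraction.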
Let me know if you need a hand adapting those scripts, or if you find a better solution!
- Piete.