copy last 10000+ lines of large text file to a temporary file
I need to periodically copy the last 81424 lines of a large (> 5 GB) text file that a program is continually writing to. The file is not arranged in columns, but it does have a repetitive structure that repeats every 81424 lines. I need to scan this data at short intervals (under 15 seconds) so I can stop the program as soon as a certain condition appears in the data.
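For context, the monitoring loop I have in mind is roughly the following, where `CONDITION_REGEX` and `PROG_PID` are placeholders for my actual stop pattern and the writer's process ID:

    while true; do
        tail -n 81424 f.txt > ftemp.txt          # grab the last full block
        if grep -q 'CONDITION_REGEX' ftemp.txt; then
            kill "$PROG_PID"                     # stop the writing program
            break
        fi
        sleep 10                                 # poll well under the 15 s budget
    done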
I have tried the following, but I seem unable to tail more than 10000 lines at a time:
tail -n 81424 f.txt > ftemp.txt
I also tried the following approaches, but they take too much wall-clock time (over 4 minutes):
1. counting the number of lines in the file, using `cat f.txt | wc -l`, and then using `head` and `tail` in a pipeline to print out the last 81424 lines of the file (lines totallines-81424+1 through totallines); see the sketch after this list.
2. using sed from the infamous 'sed one-liners' to output the last 81424 lines:
sed -e :a -e '$q;N;81425,$D;ba' f.txt
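For reference, item 1 amounted to something like this (the exact pipeline may have differed slightly, but the cost is the same: the `wc -l` alone has to read all 5 GB):

    total=$(cat f.txt | wc -l)               # full line count; this read alone is slow
    start=$((total - 81424 + 1))             # first line of the final block
    tail -n "+$start" f.txt > ftemp.txt      # everything from line $start onward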
Any suggestions? Is there a way to raise the limit on the number of lines that tail can output? Or can I somehow copy the last fixed-size chunk of the file (in bytes or something) to another file?
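To illustrate that last question, I am imagining something like the following, where 50000000 is a made-up upper bound on the byte size of one 81424-line block:

    # Read only the last ~50 MB by byte offset instead of counting lines,
    # then keep the last 81424 lines of that small chunk.
    tail -c 50000000 f.txt | tail -n 81424 > ftemp.txt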