Actually I don't understand your problem with sed. An example:
Code:
$ printf 'line 1\nline 2\nline 3\n' > theFile.txt
Flesym made a good suggestion about using two files ... but:
to delete from line 1 up to the delimiter \n (I think):
Code:
sed -i '1,/\\n$/d' theFile.txt
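For what it's worth, sed reads its input line by line and never sees a literal newline inside a line, so a pattern like /\n$/ can never match; a blank line makes a workable delimiter instead. A minimal sketch (the file name and contents are just demo values, and -i as used here is GNU sed):

```shell
# build a small demo file: two "old" lines, a blank delimiter line, two "new" lines
printf 'old 1\nold 2\n\nnew 1\nnew 2\n' > theFile.txt

# delete from line 1 through the first blank line (the delimiter)
sed -i '1,/^$/d' theFile.txt

cat theFile.txt
# new 1
# new 2
```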
True, sorry :D Since sed strips the trailing newline before matching, the \n address never matches, so the 1,/.../d range runs to the end of the file and the entire file gets deleted ...
Yeah, good thing I tried it on a prototype and not on my actual file :D
Why not use logrotate(8)?
Quote:
logrotate does whatever you want; it won't delete anything unless you tell it to. Typically you move the files to another dir, e.g. an archive dir, and gzip them.
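As a sketch of the logrotate route: a config like the following (the path /var/log/myapp.log and the size/rotate values are assumptions, not from this thread) rotates the log by size, moves old copies to an archive dir, and gzips them, without silently deleting live data:

```
/var/log/myapp.log {
    size 1M                  # rotate once the file grows past 1 MB
    rotate 5                 # keep at most 5 rotated copies
    compress                 # gzip the rotated files
    olddir /var/log/archive  # move rotated files into an archive dir
    missingok                # don't error if the log is absent
    notifempty               # skip rotation when the log is empty
}
```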
Finally I have settled on this concept:
Quote:
I wouldn't wait with the writing until the buffer is full. That could cost you the last 200 KB in a crash. There is nothing wrong with appending some text to the end of a file, at least as long as this doesn't happen several thousand times per second. The rest sounds good.
Yeah, the problem is my system generates thousands of messages in a short time, so I prefer the buffer approach rather than writing to the file each time. Also, I plan to have some code that writes the data in the buffer out to a file before the crash.
If you can predict the crash like that, you can probably prevent it ....
That's why I have a small buffer size, so the data lost won't be much.
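The buffer-plus-flush idea above can be sketched in shell. The log name, the buffer cap, and the log_msg/flush helpers are all assumptions for illustration: messages accumulate in a variable, a full buffer triggers a write, and a trap flushes whatever remains when the process exits, so a crash loses at most one partial buffer.

```shell
#!/bin/bash
# Sketch only: names and the tiny MAX_BUF value are made up for the demo.
LOG=myapp.log
: > "$LOG"            # start with an empty log for this demo

BUF=""
COUNT=0
MAX_BUF=2             # flush after this many buffered messages

flush() {
    # append whatever is buffered to the log, then empty the buffer
    if [ -n "$BUF" ]; then
        printf '%s' "$BUF" >> "$LOG"
    fi
    BUF=""
    COUNT=0
}

log_msg() {
    BUF="$BUF$1
"
    COUNT=$((COUNT + 1))
    if [ "$COUNT" -ge "$MAX_BUF" ]; then
        flush
    fi
}

# flush the remaining buffer on normal exit and on common fatal signals
trap flush EXIT INT TERM

log_msg "message 1"
log_msg "message 2"   # the second message fills the buffer and triggers a flush
```

A small buffer keeps the worst-case loss small, exactly as argued above; the trap narrows the window further for any messages still buffered when the process dies.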