Programming: This forum is for all programming questions.
The question does not have to be directly related to Linux, and any language is fair game.
Is ftruncate() used to truncate the file at the end only? I need to remove/truncate the top of my file, as I keep appending content at the bottom. How do I do it?
No, you cannot remove a random line from a file. You can only append to or truncate the end, as well as overwrite existing data. If you want to remove/insert data at a random position, you have to copy the file into a buffer, modify that buffer and copy it back to the file. No way around it, sorry.
What I said above is also true for the first line. But it really isn't very hard to implement a short function that copies the file into a buffer, removes the first line and copies it back. Another way would be to use existing (higher-level) functions to strip that line away. An example using the *NIX command 'sed' would be:
Code:
sed -i '1d' theFile.txt
In C you can call this command through the 'system()' function:
Hey, thanks a lot, this is the sort of thing I wanted. Now my problem is nearly solved: I am able to remove the first line and do insertions at the end. But the owner of the file has used the NEWLINE character as the delimiter for each message that is to be stored, so is it possible to delete based on a delimiter?
It can be done with a linked list with lines as elements. Or you may keep appending data to the end of the file and maintain a pointer to the last byte of the old data you want to discard. Then you can use a combination of mmap() and memmove() to move the remaining data to the top and ftruncate() the file.
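A sketch of that mmap()/memmove()/ftruncate() combination, assuming records are newline-terminated and the file fits in memory (the function name is made up):

```c
#include <fcntl.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* Drop everything up to and including the first '\n' by shifting the
 * rest of the file to the front and truncating the stale tail. */
int drop_first_record(const char *path)
{
    int fd = open(path, O_RDWR);
    if (fd < 0)
        return -1;

    struct stat st;
    if (fstat(fd, &st) < 0 || st.st_size == 0) {
        close(fd);
        return -1;
    }

    char *map = mmap(NULL, st.st_size, PROT_READ | PROT_WRITE,
                     MAP_SHARED, fd, 0);
    if (map == MAP_FAILED) {
        close(fd);
        return -1;
    }

    char *nl = memchr(map, '\n', st.st_size);
    if (nl != NULL) {
        size_t keep = st.st_size - (nl + 1 - map);
        memmove(map, nl + 1, keep);   /* shift remainder to the top  */
        msync(map, keep, MS_SYNC);    /* flush before truncating     */
        munmap(map, st.st_size);
        ftruncate(fd, keep);          /* cut off the leftover bytes  */
    } else {
        munmap(map, st.st_size);
    }
    close(fd);
    return 0;
}
```

The memmove() only touches pages that are actually dirtied, so for a 1MB file this is considerably cheaper than shelling out to sed.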
The matter is, my file size is 1MB, so when it reaches that limit I have to copy it up to a buffer, do the mmap()/memmove() described above to remove the first record, and use msync() to write it back to the file. Now for every write I have to repeat this 1MB copy to a buffer, record removal, and write-back. It sounds a bit too tedious, doesn't it?
With careful coding, only one mmap() call is needed. And with only 1MB, why is a file needed anyway?
Is it possible to use the '\n' delimiter to remove the content up to the delimiter using this sed? Or any other system call?
What are you trying to achieve then? Sed handles files per line, so if you remove up to the delimiter, you're removing everything on a line except the '\n', which doesn't make sense to me. It looks like you want to search for empty lines: ^$
Also check out some sed links: http://bookmarks.linuxquestions.org/.../one-liner+sed
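If empty lines are indeed the target, the same in-place trick works with an address pattern (again, the file name is illustrative and `-i` assumes GNU sed):

```shell
# Create a sample file with an empty line in the middle.
printf 'first\n\nsecond\n' > theFile.txt

# Delete every empty line in place.
sed -i '/^$/d' theFile.txt

cat theFile.txt
```

After the sed call the file contains only the two non-empty lines.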
Those are actually log files, each of size 1MB, and I need to keep logging my error messages from my different processes, and I must be able to retain around 1MB of data all the time. So as I keep adding, I need to delete the old messages so that there is 1MB of data all the time. That's why I now mmap() this 1MB, delete the first record with memmove(), and then use msync() to write it back to the file. The matter is, I have to do this procedure every time a message comes in after reaching the 1MB limit. Am I anywhere near the solution? Any other alternatives?
My other doubt is: now that my file has reached its full size of 1MB, if I use
Quote:
system("sed -i '\n d' filename.txt")
each line will be deleted, right? Does that mean that when the first record is deleted, the second one becomes first and then the new record could be inserted at the end? Or will the place of the deleted record be occupied by a space?
No, this isn't a valid sed command. Have a look at its man page or the link muha posted to learn how to use it. To delete the first line, use the command I already posted; it will delete up to and including the first '\n':
Code:
sed -i '1d' theFile.txt
So when your file reaches 1MB, call this command, then reopen the file for appending and write the next record to its end. But: this method should only be used if a new log entry is a relatively rare event, because it is quite expensive (sed has to buffer the whole file every time).
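Put together, the check-then-append step could look like the sketch below (the 1MB limit and file name come from the thread; the function name and structure are illustrative):

```c
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>

#define LOG_FILE  "theFile.txt"
#define LOG_LIMIT (1024 * 1024)   /* 1MB, as in the thread */

/* Append one message; if the file is already at the limit,
 * drop the oldest line first via sed (GNU sed for -i). */
void log_message(const char *msg)
{
    struct stat st;
    if (stat(LOG_FILE, &st) == 0 && st.st_size >= LOG_LIMIT)
        system("sed -i '1d' " LOG_FILE);

    FILE *f = fopen(LOG_FILE, "a");
    if (f != NULL) {
        fprintf(f, "%s\n", msg);
        fclose(f);
    }
}
```

Note this only trims one line per message, so if messages vary in length the file can creep somewhat above the limit.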
Another, more efficient way would be to use two (or even more) files: begin by filling the first file until it reaches 1MB; then open the second file and keep writing to it until it is full. Now blank the first file again and log to it for the next 1MB, and so on. This way you can always review at least your desired last 1MB and won't have to worry about performance, because you only append and truncate (both are cheap file operations), so no buffer and no additional string operations are needed. But of course this has to fit into your software design.
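A minimal sketch of that two-file scheme (file names and the rotation threshold are illustrative, not from the thread):

```c
#include <stdio.h>
#include <sys/stat.h>

#define LOG_A "log.0"
#define LOG_B "log.1"
#define LIMIT (1024 * 1024)   /* rotate after 1MB */

static const char *current = LOG_A;

/* Append to the current file; when it fills up, switch to the
 * other file and blank it, so the most recent 1-2MB of messages
 * are always on disk. */
void rotating_log(const char *msg)
{
    struct stat st;
    if (stat(current, &st) == 0 && st.st_size >= LIMIT) {
        current = (current == LOG_A) ? LOG_B : LOG_A;
        FILE *blank = fopen(current, "w");  /* truncate old contents */
        if (blank != NULL)
            fclose(blank);
    }

    FILE *f = fopen(current, "a");
    if (f != NULL) {
        fprintf(f, "%s\n", msg);
        fclose(f);
    }
}
```

This is essentially what log-rotation tools do: the trade-off is that you keep up to 2MB on disk instead of exactly 1MB, but every write is a plain append.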