It's quite normal that what I pointed to isn't what you need :-) That would be all too great, wouldn't it? I only pointed to those to give you an idea of where to start. I also mentioned in a previous post that you could use several things like crontab, sleep, making it a daemon that constantly runs in the background, ... There is more than one way to get to what you need. You just need to make the decision.
I want to process the log file and save the results into my internal DB.
Which log file, and when? I imagine you just want to follow a log file from its creation until its death (when a new one gets created). Just write a script that watches your directory for files whose modification time is less than X seconds old. On startup the script grabs the latest log file and processes it, then sleeps for a second and checks the directory again; if the file has been modified, it processes only the newly appended content (diff, comm, or tail from a saved offset). If a new log gets created, you automatically start processing the new file, since it will be the only log that has recently been changed/added. As long as you sort the log files by modification time and always take the latest one, I guess that should take care of it, regardless of how many log files you have and what their names are. Or am I not understanding you (once again)?
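A minimal sketch of the "sort by mtime, take the latest, process only the new lines" idea. The directory and log names here are made up (it builds a throwaway directory with fake logs so you can run it as-is); in your case you'd point it at your real log directory and put your processing/DB step where the tail is, inside a loop with a sleep. Note that parsing ls output breaks on filenames with spaces, which is usually fine for log files.

```shell
#!/bin/sh
# Sketch only: fake log directory so the example is self-contained.
logdir=$(mktemp -d)
printf 'old\n' > "$logdir/app-1.log"
sleep 1                                   # make sure mtimes differ
printf 'new\n' > "$logdir/app-2.log"

# Pick the most recently modified log file (ls -t sorts newest first).
latest=$(ls -t "$logdir"/*.log | head -n 1)
echo "latest: $latest"

# Remember how many lines we've already processed; on each pass,
# hand only the newly appended lines to your processing step.
processed=$(wc -l < "$latest")
printf 'appended line\n' >> "$latest"     # simulate the app writing more
current=$(wc -l < "$latest")
if [ "$current" -gt "$processed" ]; then
    tail -n "$((current - processed))" "$latest"   # new content only
fi

rm -r "$logdir"
```

In a real script you'd wrap the "pick latest, compare line counts, process the difference" part in a `while true; do ...; sleep 1; done` loop (or fire it from crontab); when a new log appears it becomes the newest file and gets picked up on the next pass automatically.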